"Elf-Disclosure" for Jan 2025

January was a quiet "public-facing" month, as our various team members and developers were either focusing on holidays or on in-progress projects. We've quietly added a few new apps / providers to our stack, giving them time to "smoke test" before publicly announcing them.

To get us started, here are some geeky stats for Jan 2025, followed by a summary of some of the user-facing changes announced this month in the blog...

Stats

Focus                 Nov 2024   Dec 2024   Jan 2025
Discord members       2219       2328       2427
YouTube subscribers   579        618        654
TikTok followers      28         28         28
X followers           88         90         93

The stats below illustrate CPU cores used (not percentage). These stats only cover the DE cluster at present; we're working on cross-cluster metrics aggregation to make this data more useful.

Tenant CPU load on average is the same as the previous month, but since the graphs now (more usefully) reflect 24h of time, it's harder to see minor variations in usage over time.

CPU stats for Jan 2025

kubectl top nodes
NAME       CPU(cores)   CPU%   MEMORY(bytes)   MEMORY%
fairy01    3253m        20%    48989Mi         38%
fairy02    2219m        13%    38351Mi         29%
fairy03    3600m        22%    35117Mi         27%
gretel01   1976m        16%    31328Mi         48%
gretel02   882m         7%     20904Mi         32%
gretel07   1617m        10%    27626Mi         21%
gretel08   2241m        14%    41597Mi         32%
gretel09   1145m        7%     28556Mi         22%
gretel10   1587m        9%     27677Mi         21%
gretel11   5257m        32%    40538Mi         31%
gretel13   681m         4%     64061Mi         49%
gretel14   944m         5%     26839Mi         20%
gretel15   1624m        10%    35381Mi         27%
gretel16   1876m        11%    29543Mi         22%
gretel17   2683m        16%    40151Mi         31%
gretel19   1813m        11%    25495Mi         19%
gretel20   1739m        10%    39069Mi         30%
gretel22   756m         4%     20475Mi         15%
gretel23   791m         4%     28535Mi         22%
gretel26   1452m        9%     24623Mi         19%
gretel27   1378m        8%     19278Mi         14%
gretel30   6652m        41%    31794Mi         49%
gretel31   2038m        12%    82366Mi         63%
gretel33   2047m        12%    30128Mi         46%
gretel37   3329m        20%    27933Mi         21%
hansel01   3332m        27%    49937Mi         77%
hansel02   2432m        20%    43139Mi         67%
hansel03   1523m        19%    32627Mi         50%
hansel13   2077m        25%    45012Mi         70%
hansel14   3078m        25%    43553Mi         67%
hansel15   1652m        13%    46544Mi         72%
hansel16   2741m        22%    42101Mi         65%
hansel17   1620m        13%    30437Mi         47%
hansel18   2203m        18%    40614Mi         63%
hansel19   6146m        51%    48192Mi         75%
hansel20   1712m        14%    25911Mi         40%

Last month's (Dec 2024) for comparison:

CPU stats for Dec 2024

kubectl top nodes
NAME       CPU(cores)   CPU%   MEMORY(bytes)   MEMORY%
fairy01    2937m        18%    76426Mi         59%
fairy02    1869m        11%    55905Mi         43%
fairy03    701m         4%     79816Mi         62%
gretel07   1011m        6%     44513Mi         34%
gretel08   3809m        23%    56575Mi         43%
gretel09   2334m        14%    52190Mi         40%
gretel10   605m         3%     37128Mi         28%
gretel11   2564m        16%    51529Mi         40%
gretel13   693m         4%     86985Mi         67%
gretel14   1566m        9%     47162Mi         36%
gretel15   1870m        11%    45069Mi         35%
gretel16   2009m        12%    42307Mi         32%
gretel17   2758m        17%    51826Mi         40%
gretel19   613m         3%     42791Mi         33%
gretel20   2339m        14%    46208Mi         35%
gretel22   654m         4%     39190Mi         30%
gretel23   2001m        12%    54414Mi         42%
gretel24   1264m        7%     26617Mi         20%
gretel25   541m         3%     43809Mi         34%
gretel26   914m         5%     41056Mi         31%
gretel27   3908m        24%    38143Mi         29%
gretel29   1371m        8%     51807Mi         40%
gretel30   623m         3%     18808Mi         29%
gretel31   3176m        19%    54015Mi         41%
gretel33   841m         5%     28607Mi         44%
gretel37   3779m        23%    63097Mi         49%
hansel01   3718m        30%    50790Mi         79%
hansel02   558m         4%     27102Mi         42%
hansel03   1065m        13%    23775Mi         37%
hansel13   730m         9%     20304Mi         31%
hansel14   3016m        25%    49720Mi         77%
hansel15   3220m        26%    52838Mi         82%
hansel16   1362m        11%    32593Mi         50%
hansel17   674m         5%     26916Mi         41%
hansel18   566m         4%     21870Mi         34%
hansel19   2093m        17%    53533Mi         83%
hansel20   1968m        16%    56627Mi         88%
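To compare months at a glance, the millicore column in these dumps can be totalled with a little awk. Here's a quick sketch, assuming output saved in the format above (the two-node sample is just an excerpt for illustration):

```shell
# Sum the CPU(cores) column (millicore values like "3253m") from
# `kubectl top nodes` output, and print the total in whole cores.
sum_cores() {
  awk 'NR > 1 { gsub(/m$/, "", $2); total += $2 } END { printf "%.2f\n", total / 1000 }'
}

# Two-node sample in the same format:
printf '%s\n' \
  'NAME       CPU(cores)   CPU%   MEMORY(bytes)   MEMORY%' \
  'fairy01    3253m        20%    48989Mi         38%' \
  'fairy02    2219m        13%    38351Mi         29%' | sum_cores
```

Piping `kubectl top nodes` straight into `sum_cores` gives the live figure.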

This graph represents memory usage across the entire (DE) cluster. Tenant memory is relatively stable and consistent with the previous month.

Other high consumers of RAM:

  • csi-rclone: used for mounting all rclone-compatible storage mounts, primarily RealDebrid libraries
  • kube-system: the Kubernetes control plane, including the cilium agents which manage the networking / policy enforcement (currently 11K flows/s across 30 nodes)
  • traefik: all inbound access to the cluster / services
  • mediafusion: an excellent (but RAM-hungry!) Stremio addon
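For context on the csi-rclone entry: each of those storage mounts is just an rclone remote under the hood. A minimal sketch of such a remote, assuming a Zurg-style WebDAV endpoint at a hypothetical in-cluster address (the URL is illustrative, not a real tenant endpoint):

```ini
# Hypothetical rclone remote for a WebDAV-backed debrid mount
[realdebrid]
type = webdav
url = http://zurg:9999/dav
vendor = other
```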


Last month's spikes on the contended nodes (hansels) turned out to be related to in-cluster backups, rather than tenant-driven load, and this misconfiguration was resolved. Hansel and Gretel traffic patterns are now more aligned with what you'd expect, comparing December to November:

Why Hansel & Gretel?

Bundles are datacenter-agnostic, but nodes are specific to each datacenter, and we needed a way to differentiate US nodes from DE nodes. The fairy tale of Hansel and Gretel originates in Germany 🇩🇪

Network traffic for Jan 2025 (*hansels*)

Network traffic for Jan 2025 (*gretels*)

Last month's (Dec 2024) for comparison:

Network traffic for Dec 2024 (*hansels*)

Network traffic for Dec 2024 (*gretels*)

Retrospective

January saw changes in debrid-land, with TorBox pivoting away from Plex users to becoming more of a Stremio-only debrid provider, Riven and Comet slowing/stalling their development, and a new addon (AIOStreams) and infinite-library-building app (cli_debrid) being released.

Mooar apps

The following apps made their debut on ElfHosted during Jan 2025:

AIOStreams

AIOStreams is a clever new Stremio addon by Viren, author of one of the most popular guides linked to r/StremioAddons.

AIOStreams combines the search results from other addons, sorting / filtering them nicely and presenting them to you in one of several beautiful, parsable formats.

Two "killer features" for AIOStreams are:

  1. The inclusion of torrentio results (more on that below)
  2. The support of MediaFlow Proxy, allowing any results from any addon to be proxy-streamed through your MediaFlow Proxy instance (avoiding RD IP bans and the like)
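As a rough illustration of the sorting step, here's how a merged result list might be ordered by resolution with jq. The field names are invented for the example, not AIOStreams' actual schema:

```shell
# Sort a merged list of stream results by numeric resolution, highest first.
echo '[{"res":"720p"},{"res":"2160p"},{"res":"1080p"}]' |
  jq -r 'sort_by(.res | rtrimstr("p") | tonumber) | reverse | .[].res'
```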

A few days after an ElfHosted-sponsored public AIOStreams instance was deployed, torrentio started blocking all VPS / server ranges entirely. It may have been coincidental timing, but the change made it clear that torrentio doesn't want automated scraping from VPS/server ranges (quite reasonably, since one VPS can probably generate 1000x the load that an average Stremio user can).

As a result, we disabled torrentio scraping on the public AIOStreams instance, but continue to support it via private instances, on the basis that there is user accountability in the case of abuse.

There's a dedicated #elf-aiostreams channel in Discord to discuss the addon, and for direct-to-developer interaction.

CWA Automated Book Downloader

Calibre Web Automated Downloader is an app intended for use alongside Calibre-Web Automated; its role is to search for and download ebooks from online sources. It augments the existing OpenBooks app, which does something similar with IRC sources.

More details in this blog post

CLI Debrid

cli_debrid is a new project inspired by the original plex_debrid, and is currently very actively developed by godver3.

Ironically, the first thing you'll notice about cli_debrid is its web UI. The UI handles all the settings / config / scraping options (farewell, annoying plex_debrid menu structure!), and the app itself still runs in a CLI within the UI.

We have cli_debrid versions of all of our bundles, and a dedicated #elf-cli-debrid channel in Discord for your direct-to-developer access.

godver3 is also an Elf-illiated developer (see more below), so 33% of your cli_debrid subscription funds his ongoing development!

CineSync

A challenge with offering non-zurg-based Debrid storage mounts has been how to separate movies from TV shows, so that Plex has different libraries to scan for different types of content.

We have bespoke options for various debrid providers (Zurg makes this easy enough for RealDebrid, DebriDav does the same with the Aars for Premiumize, and DavDebrid for DebridLink), but running multiple providers at the same time can get messy, and AllDebrid doesn't work for us with DavDebrid due to API restrictions.

CineSync is a standalone project which will look at multiple source locations (think RealDebrid combined with AllDebrid!), sort the contents based on regexes and TMDB lookups, and create a separate set of structured symlinks sorted by content type, resolution, etc. These symlinks are maintained by CineSync (so they're deleted if the source is deleted), and CineSync is able to (with a Plex token) tell Plex to refresh a library when new content is added.
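The core of the symlink idea can be sketched in a few lines of shell. This is a toy regex-only sorter, not CineSync itself (which also consults TMDB), and all the paths are invented:

```shell
# Toy sorter: link a source file into shows/ or movies/ based on an SxxExx pattern.
sort_link() {
  src="$1" root="$2"
  name="$(basename "$src")"
  case "$name" in
    *S[0-9][0-9]E[0-9][0-9]*) dest="$root/shows/$name" ;;
    *)                        dest="$root/movies/$name" ;;
  esac
  mkdir -p "$(dirname "$dest")" && ln -sf "$src" "$dest" && echo "$dest"
}

root="$(mktemp -d)"
sort_link "/srv/debrid/Some.Show.S01E02.1080p.mkv" "$root"   # lands in shows/
sort_link "/srv/debrid/Some.Movie.2024.2160p.mkv" "$root"    # lands in movies/
```

Because these are symlinks, deleting the link never touches the source, and a sorter can rebuild the whole tree cheaply when the source changes.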

A few users have been trialling CineSync for the last month or so - it can be too slow on large libraries (development is ongoing in that area), but it's also super-handy for the Zurgling design, when users want a better-sorted view of their content than the mess of inconsistently-named folders that Zurg reflects from RD.

DebriDav

DebriDav is sort of like a combination of Zurg and RDTClient. It simulates both a qBittorrent API (for the Aars to add downloads) and a WebDAV server (for rclone mounts), and is an ingenious workaround for the 1TB "cloud storage" limit on Premiumize.

DebriDav works not by adding cached items to your library (which would hit the 1TB limit), but by simply storing its own "placeholder" file representing items known to be cached, and serving that via WebDAV as if it were the full file.

This fake-file trickery means that (as with symlinks) the file the user "sees" can be renamed, moved, etc., but only when the file is played is the actual content streamed from Premiumize. Since the file is not actually added to your Premiumize account, just played directly from the cache (as when you use Stremio with Premiumize), it doesn't incur cloud-storage costs. Premiumize "fair use" points are still required to stream from the cache.

We have a few enthusiastic users testing the Premiumize implementation, on the understanding that it's new-and-may-break, and a recent release has also enabled EasyNews support (currently entirely undocumented on our end, but hit me up if you'd like to test).

Once the existing trials have had a little more time to bed in (and especially EasyNews users), we'll do a bigger announcement.

Torrentio blocks servers

As noted in the AIOStreams introduction above, torrentio no longer permits access from server ranges, presumably to prevent abuse and to reserve their (freely available) resources for the widest range of standard Stremio users.

Respecting this position, while the VPNs on our Stremio addons avoid this ban for private instances, we are not going to attempt a VPN-based workaround for Plex-library-building apps such as Prowlarr, plex_debrid, etc.

ElfHosted's super-charged internal Zilean instance, combined with our internal (un-ratelimited) MediaFusion access, provides very good coverage of cached content already, and users can add a range of external scrapers and indexers to their tools, without imposing unfairly upon torrentio.

Stremio Addons SSO

Initially we made Stremio addons fully public (they need to be public for Stremio to use them, since it doesn't support auth), but users raised valid concerns about the security of their API credentials when sharing their addon with friends and family.

In early Jan, we added SSO support to the /configure pages of the addons, such that in order to configure an addon, you'll need to be signed into your ElfHosted account. The addon can be used (and shared) without SSO though, a compromise between security and practicality.
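As a sketch of how path-scoped SSO like this can be wired up, here's the general shape using Traefik's forwardAuth middleware. The hostnames, service names, and auth endpoint are all invented for illustration, not our actual config:

```yaml
http:
  middlewares:
    elf-sso:
      forwardAuth:
        # Hypothetical SSO endpoint; requests must pass auth before reaching the addon.
        address: "http://sso-proxy:4181/auth"
  routers:
    addon-configure:
      rule: "Host(`addon.example.com`) && PathPrefix(`/configure`)"
      middlewares:
        - elf-sso
      service: addon-svc
    addon-public:
      # Everything else (manifest, streams) stays public for Stremio clients.
      rule: "Host(`addon.example.com`)"
      service: addon-svc
```

Traefik prioritises routers by rule length by default, so the more specific `/configure` router wins, and only that path is gated.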

Additionally, some addons (AIOStreams and MediaFusion currently) support user-managed encryption schemes, so that a shared addon URL doesn't expose any sensitive data.

More details in this blog post.

Comet cools

The Comet Stremio Addon has been semi-functional since the debrid providers withdrew their cache endpoint checks in November. With the help of StremThru, we've restored some ability to work with RD/AD/DL, but development on the "rewrite" branch to properly address this is currently on hold.

If Comet is working for you currently, then continue to use it, but if you're having issues, we should note that they're unlikely to be fixed in the short term, and we'd suggest transitioning to MediaFusion, which now also has Jackett support (for searching your own indexers), and the ability to proxy-stream using MediaFlow Proxy.

Open an #elf-help ticket if you'd like advice re transitioning.

TorBox turns

TorBox has made it clear that they're not to be used for "infinite library" setups, to the point that cached items in your library will be removed after 30 days of inactivity, and that efforts to circumvent this expiry are classified as "abuse".

TorBox also requested Riven remove their TorBox integration, since this was observed to be a common vector for automated abuse, so as of the next published release of Riven, TorBox support will be removed (there are no longer any ElfHosted Riven+TorBox users).

Riven recedes

On the heels of the TorBox change, Riven's development team took a 2-month hiatus during Dec/Jan, to rest and recuperate, and many user issues / bugs went unanswered / unfixed, causing some users to question whether Riven was abandoned.

The development team has re-organised, bringing on another front-end developer and additional project-management capacity, but has made it clear that Riven is a passion project, to be progressed as and when the developers have the time and enthusiasm.

For ElfHosted users, this means we can no longer reasonably expect Elf-specific bugfixes / features "at pace", and users who find themselves "stuck" are advised to transition to the classic Aars stack, or try out the new-but-energetically-developed cli_debrid.

Open an #elf-help ticket if you'd like advice re transitioning.

Premiumize performs

Now that we have a workable way to integrate Premiumize as a debrid provider into Plex libraries, without the 1TB cloud-storage limit, we intend to polish up our Premiumize-integrated offering.

We're pursuing a special deal with Premiumize (TBD), but in the short-term, we've been given a collection of 14-day trial vouchers for Premiumize, for Plex users who want to try it out.

More details will follow once DebriDav settles, and if a special Premiumize deal can be reached, but open an #elf-help ticket in the meantime to request your Premiumize trial voucher!

US Cluster speed fixes

Due to the gradual way we grew out the US cluster, we ended up in a situation where our cluster CNI was using VXLAN tunnels between publicly addressed nodes (on different subnets) to provide our network overlay. While not wrong, this is inefficient and hard to debug.

Additionally, the integration of tailscale on a node level seemed to cause unwanted interaction, "amplifying" tunneled traffic between all the cluster nodes, and effectively creating a hard-to-debug intermittent speed problem which only affected some of the users, on some of the nodes, some of the time.

There are some technical limitations to the hosting configuration which led to this setup, but during a few glowups towards the end of January, we worked around these to get rid of the tunnels (and the tailscales), and as a result we now have optimal speeds in the US cluster again.
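For the curious, the change amounts to switching the CNI from tunnelled to native routing. In Cilium terms it looks something like the Helm values below, which are illustrative of the approach rather than our exact config:

```yaml
# Cilium Helm values: route pod traffic natively between nodes instead of
# encapsulating it in VXLAN tunnels.
routingMode: native                     # was: tunnel (vxlan)
autoDirectNodeRoutes: true              # install node-to-node routes directly
ipv4NativeRoutingCIDR: "10.42.0.0/16"   # example pod CIDR
```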

We've held off encouraging migrations from DE to US until the speed issue was resolved, but now that it's been put to bed, we'll be suggesting that US users test their speed at https://speed.elfhosted.com, and migrate their stack if it's significantly to their advantage.

Coming up

US East Coast DC

Yes. I know. We talk about this every month. The latest from CrunchBits as of 30 Jan is that they're waiting for their provider to make space for new gear, and our order (placed months ago but not paid yet) is a priority.

Watch this space (as usual).

Elf-illiate program trial

We have an automated referral program, making it possible for happy Elfies to "pay it forward" by referring us to friends, and receiving a $10 credit against their account if this referral converts.

During January, we transitioned our (previously manual and dumb) developer contributions system to a more advanced "Elf-illiate" program, which allows us to calculate and pay out our participating open-source developers for a portion of subscription fees.

Unlike the referral program, the affiliate program requires approval, but pays out directly in cash, as opposed to account credit. The idea is to encourage users with an "audience" (elf-influencers?) to spread the word about ElfHosted, in return for a commission on sales.

We'll cover this in a detailed blog post during February, but if you're interested in helping with an initial trial, start your Elf-illiate application here.

Even mooar apps

Apps currently requested can be found (and submitted!) here

Notable suggestions:

ElfGuides (ongoing)

We've made videos about how to drive our most popular setups, but given the tools and apps change so fast, the videos very often become out-of-date. Re-recording a video simply to address a change to a single tool in a larger workflow can be tedious and time-consuming, so we've been exploring another option.

The "ElfGuides" are a collection of ScribeHow documents, assembled modularly from a collection of "Scribes" (screenshot-driven guides), which can be mixed-and-matched to provide a detailed guide per stack (there are more than 30 variations now!). When a tool in the stack changes, updating the guides is just a matter of updating the individual "module" covering that tool.

If you've been a long-time Elfie, you'll not have seen any guides, but they're emailed to new subscribers as they start their subscription!

The most popular app stacks are covered in the ElfGuides currently, but given the variety / rate of change we face, the effort to maintain these is... ongoing.

Your ideas?

Got ideas for improvements? Send us an EEP (ElfHosted Enhancement Proposal) here!

How to help

Another effective way to help is to drive traffic / organic discovery, by mentioning ElfHosted in appropriate forums such as Reddit's r/plex, r/realdebrid, and r/StremioAddons, which is where much of our target audience is to be found!

Join us!

Want to get involved? Join us in Discord!

What users say..

Here's what some of our users say..

I am new here, but today I realized that Elfhosted is one of the best free and open source software communities I've seen, and FOSS communities have been at the center of my life since the 90s (Perl, PHP, Symfony, Drupal, Ethereum, etc.). Great open software built by great people who care = great community, and that is something special.

You've done an amazing job @Funky Penguelf with the platform you provide and this place has an awesome mix of active community caretakers and software creators that I've seen here so far like BSM, Spoked, LayeZee and other elf vengers. Keep up the energy, productivity and community and take time to enjoy it and appreciate each other!

⭐⭐⭐⭐⭐ @skwah (Discord)

I self host and share a fully automated ‘arr stack with Plex. Been doing so for around 4 years. Also recently got into real debrid and hosting a Comet and Annatar for Stremio. The amount of time and head banging I’ve put into it is in the hundreds to thousands of hours. From setting it up to keeping it running smoothly. Let’s not forget the cost of my server and how much it cost to keep it running.

Anyway I wanted to see what ElfHosted was about to compare. Yeah I had the whole thing setup in just a few hours. It also passes the headache of maintaining it to ElfHosted. Will I keep it no because nerdy things and maintaining my server are my hobby and quirky passion project. Will I recommend it to my friends who don’t have the money up front to buy a server, the knowledge to maintain it or desire.

Just my server alone was $2k. Power cost to keep it on yearly is $250ish, annual memberships to RD, Usenet and indexers are around $100. Then whatever a value my free time at. Which is currently at minimum my hourly pay at work or more. Yeah so take the monthly cost of all that and compare to ElfHosted Ultimate Stream package at $39 monthly, add RD to the cost and get nearly all your time back is incredibly cheap.

Lastly it seems like a lot of people forget how quickly an ultimate cable package used to cost. Or how quick paying for every stream service would add up to. Which when using ElfHosted with RD is essentially and more what you get. Quick hint it’s far above the asking price.

⭐⭐⭐⭐⭐ /u/MMag05. (Reddit)

As a happy Elfhosted customer—who also self hosts MANY things across about 10 severs (dedicated, VPSes, and VMs running on Synology), I wouldn’t switch to self hosting the services I get from Elfhosted. They just work with very little effort configuring things, and the support the owner and his team provides is second to none. Plus I love being part of a fledgling—but quickly growing—enterprise.

⭐⭐⭐⭐⭐ /u/jatguy. (Reddit)

I recently found ElfHosted and decided to start out with the Infinite Starter Kit. Within a week I realized that this was for me and upgraded to the Hobbit plan. Give it another week and I was up to the Ranger plan.

I just love the simplicity and the fact that things just work. For years I've ran a home server and between the constant maintenance and always upgrading harddrives, it became apparent I wanted to make it easier on my self. Enter ElfHosted.

Setup was super easy with the guided documentation and the discord community. It seems that somebody is available at all hours of the day to help with questions. I started with the Aars, which I knew from my prior hosting... but saw a newer product called Riven. I decided to jump in feet first. I enjoy being on the front end of an up and coming replacement for the Aars and will soon be upgrading to the annual plan!

⭐⭐⭐⭐⭐ @.theycallmebob. (Discord)

I’ve been using this service for a while now, and honestly, it’s a game-changer compared to anything else I’ve tried for managing my media library. The support is fantastic—super quick, and if the staff aren’t around (which rarely happens), the community steps up right away. I can’t imagine going back to any other platform.

Before this, I had my own setup with a NUC, NAS, and tools like Sonarr and Radarr. It worked pretty well for a while, and my internet speed was high enough to stream without any buffering. But in the end, it wasn’t worth the time or headache of managing all the storage and keeping everything running smoothly.

Now, with this service, everything runs smoothly in 1080p+ with no buffering issues. The interface is really easy to use, which makes managing everything a breeze. Plus, having a whole community of smart people available for guidance is a huge bonus.

I was sold from the start, which is why I quickly upgraded from a 1-month to a 3-month subscription, and I’m planning to switch to a 1-year plan soon. This service totally pays for itself, and I’m sure you won’t be disappointed. It’s been really impressive.

⭐⭐⭐⭐⭐ @seapound (Discord)

Best possible options for anyone looking for the do-it-all option along with the best customer service ive experienced in this space so far. Id rate it a 6 if I could but its limited to 5/5...

⭐⭐⭐⭐⭐ @hashmelters (Discord)

(responding to a Reddit thread re the cost of ElfHosted vs mainstream streaming / self-hosting):

I didn't know that the goal of this project was to compete with large companies running/renting entire DCs. I was under the impression that the goal of this project was to manage the updating of almost selfhosted applications on a shared platform with other users. Basically, be my sysadmin for me.

That being said, paying for services is the 'easy button'. There is a real world cost incurred for the time saved. Time is money. Time is the most valuable currency that exists. Once time is spent, it's forever lost, one cannot retrieve it again (yet). In my mind, there are 3 options for use of time with respect to: mainstream, selfhosting, elfhosted.

  • mainstream - my time is valuable and I don't want curated content and I don't care what content that I have the ability to consume. I only like what's popular.

  • elfhosted - my time is valuable, I want my own curated content without being forced to browse past the same damn entry 500 times just to find out that I can't watch the movie I want because it's not available in my current location or was removed last week from mainstream providers.

  • selfhost - I care about costs and I have nothing but time to waste or I want to learn about the backend of the systems involved. I'll pay for my own VPS/homelab, electricity, manage the OS, manage app updates, figure out how to make the apps talk nice to each other, create my own beautiful frontend.

I know how much my time is worth, does that reddit poster know how much their time is worth? Without knowing what you are worth, you can't make effective capital expenditures with respect to the time it will take to recoup the capital.

I know I don't need elfhosted at all for my use case. I choose to stay with elfhosted because it's my 'easy button'. It's an efficient capital expense for the amount of time it saves me managing my own hardware, apps and saves me electricity costs. I'm also in a situation where I don't have upload bandwidth from my home to serve HD content to myself remotely. If I lived back in a city, I would still be here. My time is worth $$/hr.

⭐⭐⭐⭐⭐ @cobra2 (Discord)

"Just wanted to check in here and let @Darth-Penguini and anyone/everyone else know...WOW. I have been struggling with storage for years, maintenance of Docker containers, upkeep, all of it. Elfhosted is so freeing. It's an amazing service that I hope to be a member of for a long, long time!"

⭐⭐⭐⭐⭐ @Fingers91 (Discord)

"I just have to say, I am an incredibly satisfied customer. I had been collecting my own content for nearly 20 years. Starting off with just a simple external HD before eventually graduating to a seedbox with 100TB of cloud storage attached and fully automated processes with Sonarr and Radarr . However, the time came when the glory days of unlimited Google Drive storage ended. I thought my days of having my full collection at my fingertips via :plex: were behind me, until I found Real-Debrid and ElfHosted.

Now I essentially have the exact same access to content as I had before, but even better. Superior support and community involvement. Content is available almost immediately after being identified. A plethora of tools at my fingertips that give me more control and automation than ever before. Wonderfully well done and impressive! I am looking forward to being a customer for a very long time! Massive kudos to @funkypenguin 🤟"

⭐⭐⭐⭐⭐ @BSM (Discord)

"I would recommend ElfHosted to anyone. It has been great so far and made life a lot easier than running my own setups. If you’re in the fence give them a try and help support this great community."

⭐⭐⭐⭐⭐ Zestyclose_Teacher20 (Reddit)

"thanks for the help and must say this is the best host I every had for my server 🙂 10/10 🙂 All other places I have try have I got a lot buff etc. Your host can even give me full power on a 4K Remux on 200GB big movie file . That's damn awesome 😄"

⭐⭐⭐⭐⭐ @tjelite (Discord)

"What an amazing support system these guys have Chris and Layzee i think it was! Both are very patient with me even though I am a newbie at all this. Very thorough and explained everything step by step with me

I couldn’t ask for anything better than the service I have received by these guys! Happy happy client❤️"

⭐⭐⭐⭐⭐ @dead.potahto (Discord)

"Very happy customer. Great service"

⭐⭐⭐⭐⭐ @ronney67 (Discord)

"Very good customer service, frequent updates, and excelent uptime!!!!!"

⭐⭐⭐⭐⭐ @ed.guim (Discord)

"I had my own plex-arrs setup on hetzner for years. Yesterday I deleted everything as elfhosted has gone above and beyond it. And it has a fantastic, active community as well! Very friendly, helpful and like-minded folks always willing to help and improve the system. Top notch!"

⭐⭐⭐⭐⭐ @alon.hearter (Discord)

"Absolutely Amazed with the patience and professionalism of all Elf-Venger Staff including bossman penguin❤️"

⭐⭐⭐⭐⭐ @dead.potahto (Discord)

"@BSM went above and beyond to make sure I had all the one on one support needed with my sub. Thank you for your patience! Elfhosted continues to be Elftastic !!"

⭐⭐⭐⭐⭐ @bfmc1 (Discord)

"really enjoying the service from elfhosted. The setup is really easy from the guides on the website. And the help on the discord channel is really quick."

⭐⭐⭐⭐⭐ @jrhd13 (Discord)

"Support is amazing, and once you find a setup which works best for you it works perfectly, very happy 😊"

⭐⭐⭐⭐⭐ @fiendclub (Discord)

"great fast service, resolved my problem and really friendly"

⭐⭐⭐⭐⭐ @allan.st.minimum (Discord)

"Great service and sorted out a billing issue super quick and easy."

⭐⭐⭐⭐⭐ @scottcall707 (Discord)

"Very friendly support, resolved a problem with my account! I also appreciate the community that has been built around the service!"

⭐⭐⭐⭐⭐ @leo1305 (Discord)

"excellent customer service and very fast replies"

⭐⭐⭐⭐⭐ @yo.hohoho (Discord)

"Loved the simplicity, experience and support"

⭐⭐⭐⭐⭐ @y.adhish (Discord)

"Very friendly help as always, problem solved, one happy elf here!"

⭐⭐⭐⭐⭐ @badfurday0 (Discord)

"Great Helpful and Fast support. Thanks!"

⭐⭐⭐⭐⭐ @.mxrcy (Discord)