
"Elf-Disclosure" for Apr 2025

During April, we made multiple significant performance and functional improvements to the Aarr stacks, saw further features and updates applied to favorite apps like Pulsarr and SeerrBridge, and braced ourselves for the Plex-remote-streaming-pocalypse, which... hasn't really arrived.

To get us started, here are some geeky stats for Apr 2025, followed by a summary of some of the user-facing changes announced this month in the blog...

Stats

Focus                 Feb 2025   Mar 2025   Apr 2025
Discord members       2495       2572       2666
YouTube subscribers   678        694        735
TikTok followers      28         27         27
X followers           96         98         102
BlueSky followers     -          6          6
Fediverse followers   -          1          1

The stats below illustrate CPU cores used (not percentage). These stats only cover the DE cluster at present; we're working on cross-cluster metrics aggregation to make this data more useful.
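
For the curious, the cross-cluster view will probably end up looking something like the hypothetical sketch below: each cluster ships its metrics to a central Prometheus-compatible store with an external "cluster" label, which can then be summed per cluster. The endpoint, label name, and metric are illustrative assumptions, not a description of our actual monitoring stack.

# Hypothetical sketch: query a central Prometheus-compatible store, assuming each
# cluster attaches a "cluster" label (e.g. via remote_write or federation)
curl -sG 'http://metrics.example.internal/api/v1/query' \
  --data-urlencode 'query=sum by (cluster) (rate(node_cpu_seconds_total{mode!="idle"}[5m]))'
# Result: CPU cores in use, summed per cluster (DE vs US)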

Average tenant CPU load is 5-10% higher than in the preceding period, but overall CPU usage is lower, thanks to some fixes and improvements in the supporting infrastructure components.

CPU stats for Apr 2025

kubectl top nodes
NAME       CPU(cores)   CPU(%)   MEMORY(bytes)   MEMORY(%)
fairy01    580m         3%       27593Mi         21%
fairy02    1954m        12%      53447Mi         41%
fairy03    3808m        23%      51922Mi         40%
gretel01   1742m        14%      25316Mi         39%
gretel02   2505m        20%      42766Mi         66%
gretel03   222m         1%       13206Mi         20%
gretel04   1208m        10%      45430Mi         70%
gretel05   1083m        9%       31904Mi         49%
gretel07   1638m        10%      25942Mi         20%
gretel08   1089m        6%       29087Mi         22%
gretel09   441m         2%       26763Mi         20%
gretel10   942m         5%       24987Mi         19%
gretel11   2762m        17%      37517Mi         29%
gretel13   1911m        11%      20736Mi         16%
gretel14   1275m        7%       38733Mi         30%
gretel15   836m         5%       32720Mi         25%
gretel16   2725m        17%      35253Mi         27%
gretel17   813m         5%       37057Mi         28%
gretel19   2336m        14%      30708Mi         23%
gretel20   2230m        13%      28087Mi         21%
gretel22   965m         6%       20195Mi         15%
gretel23   725m         4%       31945Mi         24%
gretel26   1262m        7%       25887Mi         20%
gretel27   1248m        7%       16423Mi         12%
gretel30   2775m        17%      36210Mi         56%
gretel31   1993m        12%      36189Mi         28%
gretel33   791m         4%       23564Mi         36%
gretel37   1585m        9%       22062Mi         17%
hansel01   1152m        9%       24158Mi         37%
hansel02   1296m        10%      28245Mi         44%
hansel04   928m         7%       23501Mi         36%
hansel05   1705m        14%      23604Mi         36%
hansel06   1277m        10%      26652Mi         41%
hansel07   2821m        23%      25423Mi         39%
hansel08   1056m        8%       25979Mi         40%
hansel14   1447m        12%      37012Mi         57%
hansel15   3867m        32%      47671Mi         74%
hansel16   889m         7%       23476Mi         36%
hansel17   1143m        9%       30925Mi         48%
hansel18   1042m        8%       24270Mi         37%
hansel20   1028m        8%       25540Mi         39%

Last month's (Mar 2025), for comparison:

CPU stats for Mar 2025

kubectl top nodes
NAME       CPU(cores)   CPU(%)   MEMORY(bytes)   MEMORY(%)
fairy01    657m         4%       25041Mi         19%
fairy02    2033m        12%      54269Mi         42%
fairy03    1788m        11%      40922Mi         31%
gretel01   2971m        24%      28806Mi         44%
gretel02   337m         2%       15937Mi         24%
gretel03   145m         1%       9942Mi          15%
gretel04   801m         6%       23121Mi         36%
gretel07   360m         2%       21299Mi         16%
gretel08   721m         4%       32298Mi         25%
gretel09   459m         2%       21648Mi         16%
gretel10   4242m        26%      58585Mi         45%
gretel11   1210m        7%       32423Mi         25%
gretel13   3241m        20%      30307Mi         23%
gretel14   7668m        47%      40425Mi         31%
gretel15   2799m        17%      29300Mi         22%
gretel16   1859m        11%      22386Mi         17%
gretel17   3755m        23%      32923Mi         25%
gretel19   2537m        15%      80640Mi         62%
gretel20   1937m        12%      28771Mi         22%
gretel22   898m         5%       25351Mi         19%
gretel23   2151m        13%      37010Mi         28%
gretel26   2211m        13%      25469Mi         19%
gretel27   2321m        14%      25358Mi         19%
gretel30   987m         6%       23808Mi         37%
gretel31   607m         3%       25442Mi         19%
gretel33   2340m        14%      20301Mi         31%
gretel37   3997m        24%      45488Mi         35%
hansel01   1104m        9%       27231Mi         42%
hansel02   2191m        18%      23370Mi         36%
hansel04   713m         5%       27569Mi         42%
hansel05   1446m        12%      17211Mi         26%
hansel06   1651m        13%      23849Mi         37%
hansel07   2341m        19%      25620Mi         39%
hansel08   1369m        11%      22478Mi         35%
hansel14   1909m        15%      27661Mi         43%
hansel15   1249m        10%      43481Mi         67%
hansel16   751m         6%       19163Mi         29%
hansel17   705m         5%       19594Mi         30%
hansel18   974m         8%       19538Mi         30%
hansel20   1383m        11%      25043Mi         39%

This graph represents memory usage across the entire (DE) cluster. Tenant RAM usage has increased by ~20%, while CPU usage only increased 5-10%, which may be a result of the Aarr SQLite-to-PostgreSQL migration detailed below.

Other high consumers of RAM:

  • csi-rclone: used to mount all rclone-compatible storage, primarily RealDebrid libraries
  • kube-system: the Kubernetes control plane, including the cilium agents which manage the networking / policy enforcement (currently 11K flows/s across 30 nodes)
  • traefik: all inbound access to the cluster / services
  • mediafusion: an excellent (but RAM-hungry!) Stremio addon
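
If you're wondering how to eyeball where the RAM actually goes on a cluster like this, kubectl makes it fairly painless, provided metrics-server is installed. The namespaces below are illustrative examples rather than a guaranteed match for our layout:

# Show the hungriest pods in a namespace, sorted by memory (requires metrics-server)
kubectl top pods -n kube-system --sort-by=memory | head -n 10
kubectl top pods -n csi-rclone --sort-by=memory | head -n 10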

Memory stats for Apr 2025

kubectl top nodes
NAME       CPU(cores)   CPU(%)   MEMORY(bytes)   MEMORY(%)
fairy01    580m         3%       27593Mi         21%
fairy02    1954m        12%      53447Mi         41%
fairy03    3808m        23%      51922Mi         40%
gretel01   1742m        14%      25316Mi         39%
gretel02   2505m        20%      42766Mi         66%
gretel03   222m         1%       13206Mi         20%
gretel04   1208m        10%      45430Mi         70%
gretel05   1083m        9%       31904Mi         49%
gretel07   1638m        10%      25942Mi         20%
gretel08   1089m        6%       29087Mi         22%
gretel09   441m         2%       26763Mi         20%
gretel10   942m         5%       24987Mi         19%
gretel11   2762m        17%      37517Mi         29%
gretel13   1911m        11%      20736Mi         16%
gretel14   1275m        7%       38733Mi         30%
gretel15   836m         5%       32720Mi         25%
gretel16   2725m        17%      35253Mi         27%
gretel17   813m         5%       37057Mi         28%
gretel19   2336m        14%      30708Mi         23%
gretel20   2230m        13%      28087Mi         21%
gretel22   965m         6%       20195Mi         15%
gretel23   725m         4%       31945Mi         24%
gretel26   1262m        7%       25887Mi         20%
gretel27   1248m        7%       16423Mi         12%
gretel30   2775m        17%      36210Mi         56%
gretel31   1993m        12%      36189Mi         28%
gretel33   791m         4%       23564Mi         36%
gretel37   1585m        9%       22062Mi         17%
hansel01   1152m        9%       24158Mi         37%
hansel02   1296m        10%      28245Mi         44%
hansel04   928m         7%       23501Mi         36%
hansel05   1705m        14%      23604Mi         36%
hansel06   1277m        10%      26652Mi         41%
hansel07   2821m        23%      25423Mi         39%
hansel08   1056m        8%       25979Mi         40%
hansel14   1447m        12%      37012Mi         57%
hansel15   3867m        32%      47671Mi         74%
hansel16   889m         7%       23476Mi         36%
hansel17   1143m        9%       30925Mi         48%
hansel18   1042m        8%       24270Mi         37%
hansel20   1028m        8%       25540Mi         39%

Last month's (Mar 2025), for comparison:

Memory stats for Mar 2025

kubectl top nodes
NAME       CPU(cores)   CPU(%)   MEMORY(bytes)   MEMORY(%)
fairy01    657m         4%       25041Mi         19%
fairy02    2033m        12%      54269Mi         42%
fairy03    1788m        11%      40922Mi         31%
gretel01   2971m        24%      28806Mi         44%
gretel02   337m         2%       15937Mi         24%
gretel03   145m         1%       9942Mi          15%
gretel04   801m         6%       23121Mi         36%
gretel07   360m         2%       21299Mi         16%
gretel08   721m         4%       32298Mi         25%
gretel09   459m         2%       21648Mi         16%
gretel10   4242m        26%      58585Mi         45%
gretel11   1210m        7%       32423Mi         25%
gretel13   3241m        20%      30307Mi         23%
gretel14   7668m        47%      40425Mi         31%
gretel15   2799m        17%      29300Mi         22%
gretel16   1859m        11%      22386Mi         17%
gretel17   3755m        23%      32923Mi         25%
gretel19   2537m        15%      80640Mi         62%
gretel20   1937m        12%      28771Mi         22%
gretel22   898m         5%       25351Mi         19%
gretel23   2151m        13%      37010Mi         28%
gretel26   2211m        13%      25469Mi         19%
gretel27   2321m        14%      25358Mi         19%
gretel30   987m         6%       23808Mi         37%
gretel31   607m         3%       25442Mi         19%
gretel33   2340m        14%      20301Mi         31%
gretel37   3997m        24%      45488Mi         35%
hansel01   1104m        9%       27231Mi         42%
hansel02   2191m        18%      23370Mi         36%
hansel04   713m         5%       27569Mi         42%
hansel05   1446m        12%      17211Mi         26%
hansel06   1651m        13%      23849Mi         37%
hansel07   2341m        19%      25620Mi         39%
hansel08   1369m        11%      22478Mi         35%
hansel14   1909m        15%      27661Mi         43%
hansel15   1249m        10%      43481Mi         67%
hansel16   751m         6%       19163Mi         29%
hansel17   705m         5%       19594Mi         30%
hansel18   974m         8%       19538Mi         30%
hansel20   1383m        11%      25043Mi         39%

Network usage during the snapshot period has increased from last month's snapshot, but given the changing nature of traffic patterns across the day / week, it's not possible to reach any conclusions about the changes from month-to-month. Rather, the graphs below indicate that our nodes are not contending for network throughput, and our per-tier egress rate-limits are being correctly enforced.
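
As an aside, since our networking is Cilium-based, one common way to enforce per-pod egress limits is Cilium's bandwidth manager, which honours the standard egress-bandwidth annotation. The sketch below is purely illustrative (made-up pod name and rate) and isn't necessarily our exact mechanism:

# Hypothetical example: cap a pod's egress at 50Mbit/s via the standard bandwidth annotation
kubectl annotate pod my-tenant-plex-0 kubernetes.io/egress-bandwidth=50M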

Why Hansel & Gretel?

Bundles are datacenter-agnostic, but nodes are specific to each datacenter, and we needed a way to differentiate US nodes from DE nodes. The fairy tale of Hansel and Gretel originates in Germany 🇩🇪

Network traffic for Apr 2025 (*hansels*)

Network traffic for Apr 2025 (*gretels*)

Last month's (Mar 2025), for comparison:

Network traffic for Mar 2025 (*hansels*)

Network traffic for Mar 2025 (*gretels*)

Retrospective

Decypharr

Decypharr was originally announced in Mar 2025; in April, we bravely pivoted to its "beta" branch, for improved config editing via the UI, and WebDAV support, which means that Decypharr can now be used as a replacement for Zurg.

What's wrong with Zurg?

Nothing; it's more about what's wrong with you 😛. Currently, when Zurg detects changes to your RD library, it initiates a full refresh, which can take time (several minutes on larger libraries). During this time, the contents of the RD folder are not refreshed, so automated download tools (like... Decypharr) sit around for several minutes, waiting for each "downloaded" item to appear in your RD mount.

Since Decypharr is doing the downloading anyway, replacing Zurg with Decypharr makes this process entirely internal: Decypharr instantly "downloads" the target media, bypassing the Zurg refresh-delay.
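
If you're playing along at home, the WebDAV support means an rclone mount can point straight at Decypharr rather than Zurg. A rough sketch follows; the remote name, URL, port, and path are placeholders rather than our production config:

# Hypothetical sketch: mount Decypharr's WebDAV endpoint with rclone
rclone config create decypharr webdav url=http://decypharr:8282/webdav vendor=other
rclone mount decypharr: /mnt/decypharr --allow-other --dir-cache-time 10s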

See more details in this blog post.

PostgreSQL

During April, we carefully transitioned all our Radarr / Sonarr / Prowlarr instances from their native SQLite support to PostgreSQL (also natively supported, but running independently of the Aarrs). Just like the Decypharr update above, this change was motivated by some of the "heavy-hitters", whose Aarr libraries were starting to slow down and thrash CPU due to SQLite's limited write concurrency.

Here to explain it more elegantly than I, is ChatGPT:

When Radarr queues a hundred shows, And Sonarr scans where content goes, SQLite may start to groan and stall— A single thread can’t do it all.

But PostgreSQL, with power wide, Handles locks and reads with pride, Concurrency it does embrace, With queries flying—grace and pace.

So if your media starts to sprawl, And file scans echo through the hall, Let Postgres take the backend load— A smoother, faster data road.
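
For the technically curious, migrations of this sort are typically done with pgloader, roughly along these lines (a hedged sketch with placeholder paths and credentials, not our exact runbook):

# Hypothetical sketch: copy an existing Radarr SQLite DB into PostgreSQL with pgloader.
# (The Aarr is typically started once against Postgres first, so the schema already exists.)
pgloader --with "quote identifiers" --with "data only" \
  /config/radarr.db \
  'postgresql://radarr:secret@postgres:5432/radarr-main'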

More details in this blog post.

"The shim"

No, it's not a prison-fighting tool (that's a "shank"), and it's not a "she/him = shim" gender thing... in our case, it's a clever way to "trick" the Aarrs into processing symlinked debrid downloads without having to "ffprobe" each one, saving bandwidth and time when processing a large queue.

This time, I asked ChatGPT to rap a technical explanation for you:

(Beat drops...) 🎤 Yo, listen up now, let me make this clear, We talkin’ 'bout shims, so lend me your ear. It ain’t a gender thing, don’t get it twisted, Not a prison shank, so scratch that off your wishlist.

A shim in tech—yo, it’s kinda sly, It slides in the middle when code goes awry. It’s a little piece of code, fills in the gap, Makes old stuff work with the new, no cap.

You got a program lookin’ for an old API? But your system changed and said “Goodbye”? That’s where a shim steps in on cue— Talks to both sides like a tech guru.

So next time someone says “I threw in a shim,” Don’t look confused, and don’t think grim. It’s just a smart fix, a compat’ layer win— Helping software talk, like it always been.
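
To make the rap concrete: a shim is just something that sits in front of the real thing and answers on its behalf. The generic, hypothetical example below (not our actual implementation, and "slowtool" is made up) intercepts calls to an expensive tool via the PATH and short-circuits repeat queries from a cache:

#!/usr/bin/env bash
# Generic shim example: placed earlier on the PATH than the real binary,
# it answers repeat queries from a cache and only calls the real tool when it must.
REAL=/usr/local/bin/slowtool.real     # the renamed original binary (hypothetical)
CACHE=/tmp/slowtool-cache
mkdir -p "$CACHE"
key=$(echo "$*" | md5sum | cut -d' ' -f1)
if [ -f "$CACHE/$key" ]; then
  cat "$CACHE/$key"                   # cheap path: cached answer, no real work done
else
  "$REAL" "$@" | tee "$CACHE/$key"    # expensive path: run the real tool once, cache it
fi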

More details in this blog post.

Huntarr

Huntarr is a clever new tool which plugs yet another gap in the Aarr workflow: it looks at your Aarr libraries, and "hunts" missing content. While this role was previously fulfilled by Scanarr (shoutout to Puks The Pirate), we've put Scanarr to bed and fully embraced the Huntarr!

All ElfHosted Aarr users get Huntarr built into their bundles.

More details in this blog post.

Byparr

Flaresolverr is (was?) a tool for automatically "solving" the Cloudflare CAPTCHAs employed by some indexers, so that automated tools like Prowlarr and Jackett can scrape them without human intervention. Flaresolverr hasn't worked well for a while now, and is unlikely to ever do so again.

Byparr is a modern, drop-in replacement for Flaresolverr. We've "dropped it into" all the Flaresolverr bundles, so where you see flaresolverr in ElfHosted, it's actually Byparr, behind the scenes.

The change from Flaresolverr to Byparr has also allowed us to allocate fewer resources to "Flaresolverr", so you'll notice that it's now included by default with all Hobbit bundles (previously Ranger+ only).
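
Because Byparr is a drop-in replacement, nothing downstream needs reconfiguring; the usual FlareSolverr-style API request still works. By way of illustration (the hostname and port below are placeholders; 8191 is simply the traditional Flaresolverr default):

# FlareSolverr-compatible request, now answered by Byparr behind the scenes
curl -s -X POST http://flaresolverr:8191/v1 \
  -H 'Content-Type: application/json' \
  -d '{"cmd": "request.get", "url": "https://example.com", "maxTimeout": 60000}'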

More details in this blog post.

SeerrBridge

We announced SeerrBridge support last month, but the dev dropped a significant update in response to user requests - it's now possible to use SeerrBridge with ongoing TV shows, a huge QOL improvement for our initial beta-testers.

More details in this blog post.

Bundles Simplified

We've switched the format of the streaming bundles in the store over to "variable" products, which means that instead of 50+ possible product variations, there are now only 6 streaming bundles, each of which can be customized for your preferred media player, cloud provider, and media automation (existing subscriptions unaffected).

In order of increasing awesomeness, and now with nice and easy-to-remember URLs, these are:

  1. Zurglings (contended, no automation)
  2. Starters (contended)
  3. Hobbits (semi-dedicated)
  4. Rangers (semi-dedicated)
  5. Halflings (semi-dedicated)
  6. Nazguls (dedicated)

All bundles come with your choice of Plex (PlexPass required), Emby, or Jellyfin; can be managed (except Zurglings) by Riven, SeerrBridge, plex_debrid, cli_debrid, or Radarr/Sonarr/Prowlarr; and support RealDebrid, AllDebrid (VPN required), Premiumize, and EasyNews (app-dependent).

Mooar Apps

ListSync

ListSync "feeds" your OverSeerr / JellySeerr with content from Trakt, IMDB, or Letterboxd lists, allowing you to sync your libraries with your own (or other public) lists.

Letterboxd Trakt Sync (beta)

This one was added in response to a request from an Elfie. Need to sync your Letterboxd "diary" with your Trakt account? There's an app for that!

Profilarr (beta)

A long-time EEP, Profilarr is a UI-based tool to manage quality profiles in the Aarrs.

Initial testing by @Layezee20 has been very positive, and if you're an early adopter who's interested in fine-tuning your Aarr quality profiles, this may bring you hours of delight!

No documentation yet, but it's follow-your-nose easy, with a subscription available here.

AudioBookBayAutomated (beta)

AudioBookBay Automated is another bleeding-edge app which needs testing, but if you're an audiobook-phile, it may pique your interest...

Coming up

US East Coast DC

It turns out that part of the delay in establishing our PA site has been related to the hardware supplier of the equivalent configurations we're using in the west-coast site (elfhosted.cc). After some back-and-forth with our friends at Crunchbits, we've decided to pivot: we'll invest more heavily in the PA site, own our own hardware outright, and scale up capacity by 200-300%.

The appropriate 🔥 has been lit, and we're getting high-priority, VIP treatment with a mind to establish the PA site by the end of May! 🤞

Symlink Cleaner

Our very own ElfVenger, @BSM, has been working on a new, Elf-sclusive app, cleverly named "Symlink Cleaner" (he took a break during much of April for IRL responsibilities!). Symlink Cleaner is currently available free as an early-access trial, after which we intend to roll it out to all Elfies, free of charge.

Even mooar apps

Apps currently requested can be found (and submitted!) here

Notable suggestions:

Your ideas?

Got ideas for improvements? Send us an EEP (ElfHosted Enhancement Proposal) here!

How to help

An effective way to help is to drive traffic / organic discovery, by mentioning ElfHosted in appropriate forums such as Reddit's r/plex, r/realdebrid, and r/StremioAddons, which is where much of our target audience is to be found!

Join us!

Want to get involved? Join us in Discord!

What users say..

Here's what some of our users (friends!) say..

I am new here, but today I realized that Elfhosted is one of the best free and open source software communities I've seen, and FOSS communities have been at the center of my life since the 90s (Perl, PHP, Symfony, Drupal, Ethereum, etc.). Great open software built by great people who care = great community, and that is something special.

You've done an amazing job @Funky Penguelf with the platform you provide and this place has an awesome mix of active community caretakers and software creators that I've seen here so far like BSM, Spoked, LayeZee and other elf vengers. Keep up the energy, productivity and community and take time to enjoy it and appreciate each other!

⭐⭐⭐⭐⭐ @skwah (Discord)

I self host and share a fully automated ‘arr stack with Plex. Been doing so for around 4 years. Also recently got into real debrid and hosting a Comet and Annatar for Stremio. The amount of time and head banging I’ve put into it is in the hundreds to thousands of hours. From setting it up to keeping it running smoothly. Let’s not forget the cost of my server and how much it cost to keep it running.

Anyway I wanted to see what ElfHosted was about to compare. Yeah I had the whole thing setup in just a few hours. It also passes the headache of maintaining it to ElfHosted. Will I keep it no because nerdy things and maintaining my server are my hobby and quirky passion project. Will I recommend it to my friends who don’t have the money up front to buy a server, the knowledge to maintain it or desire.

Just my server alone was $2k. Power cost to keep it on yearly is $250ish, annual memberships to RD, Usenet and indexers are around $100. Then whatever a value my free time at. Which is currently at minimum my hourly pay at work or more. Yeah so take the monthly cost of all that and compare to ElfHosted Ultimate Stream package at $39 monthly, add RD to the cost and get nearly all your time back is incredibly cheap.

Lastly it seems like a lot of people forget how quickly an ultimate cable package used to cost. Or how quick paying for every stream service would add up to. Which when using ElfHosted with RD is essentially and more what you get. Quick hint it’s far above the asking price.

⭐⭐⭐⭐⭐ /u/MMag05. (Reddit)

As a happy Elfhosted customer—who also self hosts MANY things across about 10 severs (dedicated, VPSes, and VMs running on Synology), I wouldn’t switch to self hosting the services I get from Elfhosted. They just work with very little effort configuring things, and the support the owner and his team provides is second to none. Plus I love being part of a fledgling—but quickly growing—enterprise.

⭐⭐⭐⭐⭐ /u/jatguy. (Reddit)

I recently found ElfHosted and decided to start out with the Infinite Starter Kit. Within a week I realized that this was for me and upgraded to the Hobbit plan. Give it another week and I was up to the Ranger plan.

I just love the simplicity and the fact that things just work. For years I've ran a home server and between the constant maintenance and always upgrading harddrives, it became apparent I wanted to make it easier on my self. Enter ElfHosted.

Setup was super easy with the guided documentation and the discord community. It seems that somebody is available at all hours of the day to help with questions. I started with the Aars, which I knew from my prior hosting... but saw a newer product called Riven. I decided to jump in feet first. I enjoy being on the front end of an up and coming replacement for the Aars and will soon be upgrading to the annual plan!

⭐⭐⭐⭐⭐ @.theycallmebob. (Discord)

I’ve been using this service for a while now, and honestly, it’s a game-changer compared to anything else I’ve tried for managing my media library. The support is fantastic—super quick, and if the staff aren’t around (which rarely happens), the community steps up right away. I can’t imagine going back to any other platform.

Before this, I had my own setup with a NUC, NAS, and tools like Sonarr and Radarr. It worked pretty well for a while, and my internet speed was high enough to stream without any buffering. But in the end, it wasn’t worth the time or headache of managing all the storage and keeping everything running smoothly.

Now, with this service, everything runs smoothly in 1080p+ with no buffering issues. The interface is really easy to use, which makes managing everything a breeze. Plus, having a whole community of smart people available for guidance is a huge bonus.

I was sold from the start, which is why I quickly upgraded from a 1-month to a 3-month subscription, and I’m planning to switch to a 1-year plan soon. This service totally pays for itself, and I’m sure you won’t be disappointed. It’s been really impressive.

⭐⭐⭐⭐⭐ @seapound (Discord)

Best possible options for anyone looking for the do-it-all option along with the best customer service ive experienced in this space so far. Id rate it a 6 if I could but its limited to 5/5...

⭐⭐⭐⭐⭐ @hashmelters (Discord)

(responding to a Reddit thread re the cost of ElfHosted vs mainstream streaming / self-hosting):

I didn't know that the goal of this project was to compete with large companies running/renting entire DCs. I was under the impression that the goal of this project was to manage the updating of almost selfhosted applications on a shared platform with other users. Basically, be my sysadmin for me.

That being said, paying for services is the 'easy button'. There is a real world cost incurred for the time saved. Time is money. Time is the most valuable currency that exists. Once time is spent, it's forever lost, one cannot retrieve it again (yet). In my mind, there are 3 options for use of time with respect to: mainstream, selfhosting, elfhosted.

  • mainstream - my time is valuable and I don't want curated content and I don't care what content that I have the ability to consume. I only like what's popular.

  • elfhosted - my time is valuable, I want my own curated content without being forced to browse past the same damn entry 500 times just to find out that I can't watch the movie I want because it's not available in my current location or was removed last week from mainstream providers.

  • selfhost - I care about costs and I have nothing but time to waste or I want to learn about the backend of the systems involved. I'll pay for my own VPS/homelab, electricity, manage the OS, manage app updates, figure out how to make the apps talk nice to each other, create my own beautiful frontend.

I know how much my time is worth, does that reddit poster know how much their time is worth? Without knowing what you are worth, you can't make effective capital expenditures with respect to the time it will take to recoup the capital.

I know I don't need elfhosted at all for my use case. I choose to stay with elfhosted because it's my 'easy button'. It's an efficient capital expense for the amount of time it saves me managing my own hardware, apps and saves me electricity costs. I'm also in a situation where I don't have upload bandwidth from my home to serve HD content to myself remotely. If I lived back in a city, I would still be here. My time is worth $$/hr.

⭐⭐⭐⭐⭐ @cobra2 (Discord)

"Just wanted to check in here and let @Darth-Penguini and anyone/everyone else know...WOW. I have been struggling with storage for years, maintenance of Docker containers, upkeep, all of it. Elfhosted is so freeing. It's an amazing service that I hope to be a member of for a long, long time!"

⭐⭐⭐⭐⭐ @Fingers91 (Discord)

"I just have to say, I am an incredibly satisfied customer. I had been collecting my own content for nearly 20 years. Starting off with just a simple external HD before eventually graduating to a seedbox with 100TB of cloud storage attached and fully automated processes with Sonarr and Radarr . However, the time came when the glory days of unlimited Google Drive storage ended. I thought my days of having my full collection at my fingertips via :plex: were behind me, until I found Real-Debrid and ElfHosted.

Now I essentially have the exact same access to content as I had before, but even better. Superior support and community involvement. Content is available almost immediately after being identified. A plethora of tools at my fingertips that give me more control and automation than ever before. Wonderfully well done and impressive! I am looking forward to being a customer for a very long time! Massive kudos to @funkypenguin 🤟

⭐⭐⭐⭐⭐ @BSM (Discord)

"I would recommend ElfHosted to anyone. It has been great so far and made life a lot easier than running my own setups. If you’re in the fence give them a try and help support this great community."

⭐⭐⭐⭐⭐ Zestyclose_Teacher20 (Reddit)

"thanks for the help and must say this is the best host I every had for my server 🙂 10/10 🙂 All other places I have try have I got a lot buff etc. Your host can even give me full power on a 4K Remux on 200GB big movie file . That's damn awesome 😄"

⭐⭐⭐⭐⭐ @tjelite (Discord)

"What an amazing support system these guys have Chris and Layzee i think it was! Both are very patient with me even though I am a newbie at all this. Very thorough and explained everything step by step with me

I couldn’t ask for anything better than the service I have received by these guys! Happy happy client❤️"

⭐⭐⭐⭐⭐ @dead.potahto (Discord)

"Very happy customer. Great service"

⭐⭐⭐⭐⭐ @ronney67 (Discord)

"Very good customer service, frequent updates, and excelent uptime!!!!!"

⭐⭐⭐⭐⭐ @ed.guim (Discord)

"I had my own plex-arrs setup on hetzner for years. Yesterday I deleted everything as elfhosted has gone above and beyond it. And it has a fantastic, active community as well! Very friendly, helpful and like-minded folks always willing to help and improve the system. Top notch!"

⭐⭐⭐⭐⭐ @alon.hearter (Discord)

"Absolutely Amazed with the patience and professionalism of all Elf-Venger Staff including bossman penguin❤️"

⭐⭐⭐⭐⭐ @dead.potahto (Discord)

"@BSM went above and beyond to make sure I had all the one on one support needed with my sub. Thank you for your patience! Elfhosted continues to be Elftastic !!"

⭐⭐⭐⭐⭐ @bfmc1 (Discord)

"really enjoying the service from elfhosted. The setup is really easy from the guides on the website. And the help on the discord channel is really quick."

⭐⭐⭐⭐⭐ @jrhd13 (Discord)

"Support is amazing, and once you find a setup which works best for you it works perfectly, very happy 😊"

⭐⭐⭐⭐⭐ @fiendclub (Discord)

"great fast service, resolved my problem and really friendly"

⭐⭐⭐⭐⭐ @allan.st.minimum (Discord)

"Great service and sorted out a billing issue super quick and easy."

⭐⭐⭐⭐⭐ @scottcall707 (Discord)

"Very friendly support, resolved a problem with my account! I also appreciate the community that has been built around the service!"

⭐⭐⭐⭐⭐ @leo1305 (Discord)

"excellent customer service and very fast replies"

⭐⭐⭐⭐⭐ @yo.hohoho (Discord)

"Loved the simplicity, experience and support"

⭐⭐⭐⭐⭐ @y.adhish (Discord)

"Very friendly help as always, problem solved, one happy elf here!"

⭐⭐⭐⭐⭐ @badfurday0 (Discord)

"Great Helpful and Fast support. Thanks!"

⭐⭐⭐⭐⭐ @.mxrcy (Discord)