Games Server Fervor Part 2 - The Difference Between Closed and Open Servers
Post by Dan @ 03:29pm 08/08/14 | Comments
In a previous article we discussed the difference between game servers that are considered to be dedicated, and those that aren’t. But not all dedicated servers are created equal. This time we delve into the differences between open and closed server solutions, and the impact they can have on the multiplayer games we buy into.

As mentioned in part one, games such as MMOs are built as closed services by necessity: to handle the huge number of players connecting to each other, and to maintain the integrity of persistent character data, the servers can’t simply run in the background on the same consoles and computers players are playing the game on.

That’s not the case with smaller-scale multiplayer games however, and in the late 1990s and early 2000s -- the golden age of LAN gaming, before broadband ubiquity encouraged us to stay at home -- most first-person shooter titles with network multiplayer used an open dedicated server model. The likes of Quake, Unreal and Half-Life all empowered players with the choice of running their own standalone dedicated servers, in addition to those hosted online by the developer or publisher themselves.

This model allowed players to run servers with high player counts at LAN events without an Internet connection, while online it enabled the public to set up their own servers in datacentres with better connectivity, at effectively zero expense to the game publisher.
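To give a sense of how lightweight that model is, here’s a minimal Python sketch of the kind of wrapper a LAN organiser or community host might use to launch and babysit a standalone dedicated server process. The binary name and launch flags are loosely modelled on Half-Life’s hlds_run and should be treated as assumptions for illustration rather than an exact recipe.

```python
# launch_server.py -- sketch of a community-run dedicated server launcher.
# The binary path and flags below are illustrative (loosely modelled on
# Half-Life's hlds_run); check your game's own documentation for real options.
import subprocess
import time

SERVER_CMD = [
    "./hlds_run",          # standalone dedicated server binary (assumed path)
    "-game", "valve",      # which game/mod to serve
    "-port", "27015",      # UDP port to listen on
    "+maxplayers", "16",   # LAN-sized player cap
    "+map", "crossfire",   # starting map
]

def run_forever():
    """Start the server and restart it automatically if it ever exits."""
    while True:
        print("Starting dedicated server:", " ".join(SERVER_CMD))
        exit_code = subprocess.call(SERVER_CMD)
        print(f"Server exited with code {exit_code}, restarting in 5 seconds...")
        time.sleep(5)

if __name__ == "__main__":
    run_forever()
```

The point is simply that the server is an ordinary process under the player’s control: it can run on a spare PC at a LAN or on rented datacentre hardware, with no involvement from the publisher at all.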

Publicly-hosted servers also have the advantage of empowering a game’s community to self-moderate. When the players operating the servers have a greater degree of control over them, they can ban troublemakers from their own servers the moment problems arise, rather than having to depend on blanket bans from a centralised authority that isn’t always the best judge.
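In practice that self-moderation usually amounts to each server keeping its own local ban list and checking it at connection time, independently of any central service. A minimal sketch of the idea, with a made-up file format and player IDs purely for illustration:

```python
# per_server_bans.py -- sketch of the kind of local ban list an individual
# server operator controls. The file format and player IDs are made up here;
# real games use their own identifiers (e.g. a Steam-style account ID).
from pathlib import Path

BAN_FILE = Path("banned_players.txt")  # one player ID per line

def load_bans() -> set[str]:
    if not BAN_FILE.exists():
        return set()
    return {line.strip() for line in BAN_FILE.read_text().splitlines() if line.strip()}

def on_player_connect(player_id: str) -> bool:
    """Return True to admit the player, False to refuse the connection."""
    if player_id in load_bans():
        print(f"Refusing connection: {player_id} is banned on this server only.")
        return False
    return True

def ban_player(player_id: str) -> None:
    """Add a player to this server's ban list; other servers are unaffected."""
    bans = load_bans()
    bans.add(player_id)
    BAN_FILE.write_text("\n".join(sorted(bans)) + "\n")
```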

Further, and perhaps most significantly, publicly-hosted dedicated servers empowered the development of user-created content and multiplayer mods, making the likes of Counter-Strike and Team Fortress possible and extending the lifespan and value of most games that offered such support.



The turning point from this predominantly open server utopia to the mostly closed-up one we see today can largely be attributed to the rise of online multiplayer gaming on consoles. The walled-off console ecosystems, with their stringent software certification processes and secure networking environments, made it prohibitively difficult for developers to allow players to run their own games servers.

A handful of earlier multiplayer console shooters, such as the original Halo and Return to Castle Wolfenstein: Tides of War, actually included the ability to run a console as a dedicated server. While that option was useful for things like LAN tournaments, it wasn’t practical for the public to host such servers in a datacentre as they could with PC games; for a start, the server could only be launched via an in-game menu on a regular Xbox or PlayStation 2 console.

So then why can’t the dedicated server software for a console game just be compiled to run on PC hardware instead? Well, that’s precisely what happens with the dedicated servers console game publishers operate themselves: they’re all run on Linux and Windows systems in big datacentres across the world. As to why such software is never made publicly available, we can only guess that it’s primarily due to the security requirements of the online services (Xbox Live, PSN and so on) that console multiplayer games are mandatorily woven into.

Another hurdle is any secondary service that a developer has wrapped around its multiplayer games for keeping track of player progression -- stats, leaderboards, weapon unlocks and all that jazz. With these kinds of features now commonplace in multiplayer games, permitting the public to run their own servers means drawing a distinction between ‘ranked’ and ‘unranked’ matches, where games played on a publisher’s own integrity-secured servers contribute to your character progress and place on the high-scores list, but those played on open servers do not.
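One way to picture that split: the stats backend only accepts match results from servers it has issued credentials to, and treats everything else as unranked. A hypothetical sketch follows -- the signing scheme, server IDs and function names are assumptions for illustration, not any particular publisher’s system.

```python
# ranked_gate.py -- hypothetical sketch of how a stats backend might decide
# whether a submitted match result counts as 'ranked'. The trusted-key scheme
# is an assumption for illustration, not any specific publisher's system.
import hmac
import hashlib

# Secret keys issued only to publisher-operated (or GSP-verified) servers.
TRUSTED_SERVER_KEYS = {
    "official-eu-01": b"secret-key-issued-by-publisher",
    "official-us-03": b"another-issued-secret",
}

def is_ranked_submission(server_id: str, payload: bytes, signature: str) -> bool:
    """A result only touches leaderboards if it comes from a trusted server."""
    key = TRUSTED_SERVER_KEYS.get(server_id)
    if key is None:
        return False  # community/open server: still playable, just unranked
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

def submit_match_result(server_id: str, payload: bytes, signature: str) -> None:
    if is_ranked_submission(server_id, payload, signature):
        print("Ranked: updating stats and leaderboards.")
    else:
        print("Unranked: result recorded for the players, no leaderboard changes.")
```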

There’s a middle ground available where publishers work with trusted third-party game server providers (GSPs) -- such as i3D, GameServers, Hypernia and, yes, our very own AusGamers Server Rentals -- in order to maintain service integrity and prevent players from cheating the game’s stats system, while still allowing players to control and moderate their own servers by renting them from the GSP. This is how the PC versions of Battlefield 3 and 4 handle things, but again, the console versions remain on lockdown, presumably due to the inflexibility of the platform holders’ additional service layers (Xbox Live and PSN).

Perhaps unfortunately for proponents of open servers, however, the growth of cloud computing services is making it more and more economical for publishers to supply their own closed servers. Until now, in order to operate their own dedicated servers, publishers had to either provision expensive racks of computing hardware themselves in datacentres around the world, or pay third-party hosting companies to manage them (at even greater expense) -- requiring a measure of clairvoyance to meet demand during a game’s busy launch period without being left oversupplied if player numbers drop.

Cloud services look poised to minimise that risk, as we’re seeing with Titanfall’s server deployment on Microsoft’s Azure cloud computing platform. With what is effectively a timed CPU and memory rental arrangement, the game can boot up as many servers as it needs during popular times and switch them off just as quickly when they’re no longer needed, with the freed resources returned to the cloud pool and automatically repurposed for whatever else other Azure customers might be using them for.
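Stripped back, the idea is a simple control loop: measure player demand, work out how many server instances that demand needs, and ask the cloud platform for the difference. A rough sketch of that loop is below; the provisioning calls are stubs, and a real deployment would go through the cloud provider’s own APIs.

```python
# autoscale_sketch.py -- simplified control loop for scaling game server
# instances with player demand. The provisioning calls are stubs; a real
# system would call a cloud provider's API (e.g. Azure) instead.
import math
import time

PLAYERS_PER_INSTANCE = 12   # assumed capacity of one server instance
running_instances = 0

def current_player_demand() -> int:
    """Stub: in reality this would come from matchmaking queue telemetry."""
    return 150

def start_instances(count: int) -> None:
    print(f"Requesting {count} new instance(s) from the cloud pool.")

def stop_instances(count: int) -> None:
    print(f"Releasing {count} instance(s) back to the cloud pool.")

def scale_once() -> None:
    global running_instances
    demand = current_player_demand()
    desired = math.ceil(demand / PLAYERS_PER_INSTANCE)
    if desired > running_instances:
        start_instances(desired - running_instances)
    elif desired < running_instances:
        stop_instances(running_instances - desired)
    running_instances = desired

if __name__ == "__main__":
    while True:
        scale_once()
        time.sleep(60)  # re-evaluate demand every minute
```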

One problem with using cloud platforms for games server applications at the moment, though, is that they don’t all have the same global coverage, as our fellow gamers in South Africa know all too well. Microsoft’s (currently) closest Azure datacentres to Johannesburg are in the UK and the Netherlands, offering unsatisfactory latency that led to the cancellation of Titanfall’s launch plans in the region. Admittedly, this is likely only a temporary issue, as the coverage of cloud service platforms will continue to expand in the coming years.

The adoption of cloud services could also mean servers for older console games stick around for much longer than they have previously. With a publisher’s games servers able to scale up and down with demand almost instantly, keeping them online when only a handful of people are left playing each night should be far more affordable. That said, there will always be some degree of ongoing expense for publishers to weigh up when an older game’s active player numbers dwindle too far.



For games where players are able to operate their own servers, the availability of legacy play is not even a question. id Software’s original Quake launched in 1996 -- 18 years ago -- and you can still fire up a server and frag like it’s 1999 today. Want to play a game of MAG, though? Sorry folks: your discs for this 2010 PlayStation 3-exclusive shooter, which offered up to 256-player matches, are effectively coasters now, as Sony has discontinued its publisher-provisioned dedicated servers entirely.

Such is the potential fate of any online multiplayer game limited to publisher-hosted servers -- and, for that matter, of most games that depend on a publisher-hosted service layer for character authentication, matchmaking or anything of the sort. Whether any such game you purchase will still be playable in 10 years depends entirely on the state its publisher is in at that point in time. The recent GameSpy shutdown illustrated this all too well: some titles were fortunately saved by community groups able to modify and hack in their own server support, but others will likely never be playable online again.

Companies like Blizzard have demonstrated an enduring commitment to their old online service-dependent games -- Diablo 2 is still powered up almost 14 years later, at this point surely running as little more than a goodwill gesture. That’s an exceptional example though; try playing online multiplayer in an EA Sports title even as recent as Madden or FIFA 11.

The preservation of older game purchases alone is a good reason to favour games with open server models over those that are locked down by a publisher or developer, and yet it’s an aspect we barely pay any attention to when anticipating and critiquing upcoming games.

In part three, we’ll explore the other big anti-consumer factor in the closed versus open servers comparison, digital rights management (DRM), and consider just why it is that the wider games community isn’t demanding publishers open up the server models for their multiplayer games.


