Future-Proofing Your Social Media Gaming Content Pipeline


While the current global recession has hit traditional AAA game studios and publishers with mixed financial results, new forces are pushing the boundaries of game development with favorable results and exciting potential. The surge in popularity of social networking has cast casual gaming into the forefront of game development and profitability while garnering mind share from new or non-traditional game developers (see Zynga, Big Fish and PlayFirst games, and the technology behind Unity3D, ShiVa3D, iPhone games, etc., for examples). First, casual gamers go hand in hand with virtual goods, virtual currencies and user-generated content (UGC). Second, social media gamers today expect frictionless web interaction.

Let’s examine these two forces that may drive social media gaming:

1 – User Generated Content (UGC) – UGC is on the rise and moving into the mainstream, first and most obviously in the non-3D space (think YouTube and Flickr, for instance) and now, thanks to the “the tool is the toy” model of games such as Spore and LittleBigPlanet, this is likely the forerunner of social, community-based 3D game content creation. As of April 8, 2010, there were more than 141,503,213 registered independent Spore creatures (http://www.spore.com/sporepedia), for instance!

Similarly, social networking games drive users to share content, as in most Facebook games like the exceedingly popular FarmVille, with over 82.7 million monthly users, which was designed from the start to leverage the social networking aspects of Facebook. Sadly, these popular social network games do not (yet?) provide easy-to-use content development tools that let users create their own models to share, whether in-game or with other games and players. And they certainly do not work cross-platform. So, here we go again with a new generation of “walled garden” applications, this time in casual gaming instead of virtual worlds. But the desire among end users is there, and this sentiment is already emerging from analysts and popular blogs. See “Time to Reject Content App Silos” by Ron Miller.

Otherwise, we would not be seeing millions of users sharing their Spore creatures, or FarmVille players gifting their pets to others to gain some in-game advantage. Imagine what well-designed UGC could net for social media game players! Users will build individualized content in droves for these games as soon as easy-to-use tools (e.g. something analogous to the Spore creature creator or SketchUp in the cloud) are provided to players and content creators alike, who can then build new content and/or access and repurpose existing repository content. Even today, UGC is often available as free 3D content (e.g. in the Google 3D Warehouse, 3DVIA, etc.) as well as in commercial 3D applications, such as virtual world builders, animation tools and databases.

Note that content repositories and standards go hand in hand: it is hard to serve or sell content if it has to be maintained in many formats, and the larger the content base in a given format, the greater the popularity of the format itself (think VHS vs. Betamax, Flash vs. QuickTime, or Blu-ray vs. HD DVD – all close in features, but content popularity made all the difference in adoption and success). Likewise, UGC will make the difference in championing a lasting 3D format standard, simply because UGC is best when shared, and the consumers of the desired content will be playing different games and/or using different applications. A commonly used, openly consumable format, analogous to something like MP3, is the only solution in this case.

2 – “Plug-in Free” Web Browser – The development and fast adoption of native web rendering and “plug-in free” web browser support for 3D, as witnessed by the interest and participation in the Khronos Group’s WebGL project (http://www.khronos.org/webgl). WebGL brings OpenGL|ES 2.0 to the web by providing a 3D drawing context to the HTML5 Canvas element through JavaScript objects that closely resemble OpenGL|ES 2.0 constructs. This eases the burden of developing 3D for the web without requiring users to install a 3D web plug-in. Note that, initially, games built with WebGL will likely be relatively simple 3D model-interactivity games without full game physics, advanced animation and the like, which would naturally require plug-ins, or require very talented programmers to get the needed performance out of JavaScript.
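As a minimal sketch of what this looks like in practice, the snippet below obtains a WebGL drawing context from an HTML5 Canvas element and clears it. The canvas id and the fallback to "experimental-webgl" (the name used by early implementations) are assumptions for illustration only.

```javascript
// Minimal sketch: obtain a WebGL drawing context from an HTML5 Canvas
// element and clear it. Assumes a <canvas id="scene"> exists on the page.
var canvas = document.getElementById("scene");
var gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");

if (gl) {
  gl.viewport(0, 0, canvas.width, canvas.height);
  gl.clearColor(0.1, 0.1, 0.1, 1.0);   // dark grey background
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  // From here, shaders, buffers and draw calls follow the familiar
  // OpenGL|ES 2.0 constructs exposed as JavaScript objects.
} else {
  console.log("WebGL is not supported in this browser.");
}
```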

From the examples above, and others not yet foreseen, new and compelling content will be developed that may need to be archived for future revisions and games, later modified by developers and end users alike, and then repurposed in game play. Savvy developers will increasingly adopt open standard APIs and formats to future-proof their games and content and to maximize precious resources.

Current Practices for Social Media Gaming Content Pipelines

In light of the opportunities presented above, state-of-the-art open standards for social media gaming content pipelines are something we will see eventually, but probably not in the very near future. Why? Because today the business model and options for developing social media games generally require developers to choose a platform or platforms first, then find the best tools (and engines) for said platforms. Very few development platforms today are designed with any real cross-platform game development strategy in mind. This is a near-sighted business model that customers will likely reject in time, as more and more incompatible games arise and eventually frustrate users. Ditto for copycat games (without cross-platform communication) on multiple platforms. I recently posted a question on LinkedIn seeking tools for making cross-platform games and/or game engines; only a few options were proposed, none were fully cross-platform, and all required web plug-ins (no surprise there). Some engine providers will support open formats only to enable a pipeline that can accommodate artists using the widest possible number of content creation tools, thereby securing more and better content for their engines, but end users’ desire for cross-platform game play does not yet appear to be a factor in business decisions. Practically speaking, only content that sells games drives the licensing of engines, and that will not change any day soon.

Since social game developers tend to use standard, well-known art and design tools (Flash/Papervision3D, Unity3D, Flex, iPhone SDK, etc.) for the engines available today, it doesn’t appear that an open standard content pipeline is in demand yet for social gaming development. I think this will change once end users of social games have easy-to-use content creation tools that inspire them to become artists and developers themselves. We are already seeing nascent desire for this in the interest in, and explosion of, virtual goods repositories. Who wouldn’t want to make content that friends playing games on different platforms could use? In some discussions, this has become something of a holy grail for virtual world builders of late.

Imagine something like a WebGL-enabled, SketchUp-like tool running in the cloud, designed explicitly for social media gaming content development. The number of adopters of such a tool would be vastly larger than the number of SketchUp users today who download the free version for crafting mostly 3D structures that do not necessarily have a home outside of the Google 3D Warehouse! So, how do we get to this social media end-user content/gaming nirvana? We need a single, open, pervasive way of interchanging, archiving and reusing expensive content.

Need for an Open Social Media Gaming Content Pipeline

Before exploring the evolution of standards for content pipelines, let’s think about the game’s end users – your customers. What will social media gamers demand? Social media gaming is, for the most part, web gaming, where users will expect open and lightweight platforms that they can access at home, in the office and on the road, from their desktops, laptops and mobile devices alike. In his article “Gaming Will Save Us All” in the March 2010 issue of Communications of the ACM, Tim Chang uses the term “ubiquitous gaming” to describe the growing market of non-core gamers introduced to gaming via small personal devices. These “digital natives,” as Chang calls them, live their lives online, favoring consumption of digital formats (on almost any digital device) over any other type of media; that, coupled with short attention spans, means these natives will shy away from anything too time-consuming to access. These same users also favor web-based games and web content such as Facebook, AddictingGames.com and MySpace games, to name just a few, while more hard-core gamers will frequent portals such as PlayStation Home and Xbox Live Community Games. This yields a social media gaming graph that intersects with the cloud across all media. Selling to the customers of this social media gaming graph will inspire developers to create applications for all the open and lightweight platforms they can support. This is achievable today by implementing only what is needed from traditional content pipelines and putting that solution in the cloud.

Open Standards for Cloud-based Social Media Gaming Content Pipelines

If you open up the tools pipeline to enable developers to use a variety of independent tools, you ease the introduction of new technologies and make it possible to adapt the content pipeline for use by many developers, across genres and on all sorts of engines. From this perspective, social media gaming has content pipeline requirements similar to those of console or PC games, especially if UGC, modding and/or machinima is a desired outcome.

There are many good resources defining the in-depth technical details of creating and managing content pipelines. For a seminal technical background study, see Ben Carter’s book The Game Asset Pipeline (Charles River Media, 2004), which will guide you in crafting a solid content pipeline for your game development. For casual game and social game development, the XNA Content Pipeline also provides worthy, well-thought-out content pipeline development guidance. However, there is no strong definition yet of what is needed to create a cloud-based content pipeline, and the current implementation model of content pipeline interchange format development focuses on file interchange, not data (or content) interchange. This file-based approach is inefficient and daunting for most social game development, does not harness the “power of the cloud,” and begs for simpler methodologies yet to be defined.

If your business model dictates a future of users building, buying (or sharing) and repurposing virtual goods, then a likely requirement for your content pipeline is the ability to “future-proof” UGC from existing games (and tools masquerading as games – think Spore, for instance). The most obvious and useful choice today for a content language for communicating between tools and applications is the open standard COLLADA.

For a review of COLLADA and how it works, visit the Khronos Group’s COLLADA project at http://www.khronos.org/collada, see COLLADA: Sailing the Gulf of 3D Digital Content Creation by Rémi Arnaud and Mark C. Barnes (A K Peters, 2006), and, more recently, Rémi Arnaud’s chapter on the game asset pipeline in Game Engine Gems 1 (Jones and Bartlett, 2010). However, COLLADA as adopted today is not implemented in an optimized fashion, and thus will not work well in practice for most casual game developers wishing to develop 3D social media games, particularly those that are web or browser based only. Still, a content pipeline standard is crucial when you need to port the same content to very different platforms, and COLLADA remains well suited to this task.

First of all, the content pipeline allows the separation of artist and developer work while reducing engine/DCC format interdependencies. This is important because artists have different skill sets and know and use many different DCC tools, but are not familiar with every game engine tied to various platforms. A standard content pipeline language appropriately divorces them from needing to understand engine nuances. Secondly, many standard importers and processors will then be available for independent content: importers put your DCC content into the game where you want it, while processors deal with it in the game. Thirdly, a well-defined standard content pipeline should provide game developers with extensibility: you can always write your own importer for a new custom file format if desired, but a standard format will cover most use cases. Lastly, a well-defined content pipeline, specified by an intermediary language rather than an interchange format, will guarantee simple interaction from the game back to the pipeline, paving the way for user-generated content tools. COLLADA was defined as an XML-based intermediary language for this reason.
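To make the importer/processor split concrete, here is a hedged JavaScript sketch with invented function names and a deliberately simplified view of a COLLADA document; a real importer would walk library_geometries, library_materials and so on rather than just counting elements.

```javascript
// Hypothetical importer: reads DCC output (a COLLADA .dae document)
// into a neutral in-memory asset the rest of the pipeline understands.
function importColladaAsset(daeXmlText) {
  var doc = new DOMParser().parseFromString(daeXmlText, "application/xml");
  return {
    colladaVersion: doc.documentElement.getAttribute("version"),
    geometryCount: doc.getElementsByTagName("geometry").length,
    raw: doc                      // keep the parsed document for processors
  };
}

// Hypothetical processor: conditions the neutral asset for one engine or
// platform (triangulation, texture compression, unit conversion, ...).
function processForEngine(asset, targetEngine) {
  return {
    target: targetEngine,
    geometryCount: asset.geometryCount
    // engine-specific buffers and metadata would be produced here
  };
}

// Usage: the artist's export is imported once, then processed per target.
// var asset = importColladaAsset(daeText);            // daeText is assumed
// var iphoneBuild = processForEngine(asset, "iphone");
```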

As for today’s existing UGC situation, popular content repositories such as Google’s 3D Warehouse, Dassault’s 3DVIA, rogue Spore models and Papervision3D content all support COLLADA, allowing any of the applications that use their data to import or export 3D content to any other application supporting COLLADA. This puts COLLADA in the driver’s seat, format-wise, for 3D repository content, particularly for online 3D content delivery. What is missing is a protocol and/or specification for communicating about COLLADA-based content in the cloud. Currently, COLLADA is generally implemented using a file-based approach to interchanging data, but what the web-based gaming industry needs is a client/server approach that communicates only the content needed at a specific instant in time – an in-the-cloud pipeline for the engine to digest as needed, when needed. That is, a message-based rather than a file-based approach to interchange. Such a cloud-based pipeline approach could be hugely beneficial to developing the next generation of social media games.
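As a hedged sketch of the message-based idea, the snippet below has a client ask a hypothetical content service for just the fragment it needs right now, instead of downloading a whole .dae file. The endpoint, query syntax and field names are all invented for illustration.

```javascript
// Ask a (hypothetical) content service for one piece of a COLLADA document
// rather than the whole file. Endpoint and message shape are illustrative.
var request = {
  document: "farm_props.dae",
  select: "//geometry[@id='barn_mesh']",   // ask for one geometry only
  detail: "low"                            // let the server pick an LOD
};

fetch("https://content.example.com/collada/query", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(request)
}).then(function (response) {
  return response.text();                  // COLLADA fragment as XML
}).then(function (fragment) {
  console.log("Received", fragment.length, "bytes of COLLADA data");
});
```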

Figure 1 below depicts a forward-thinking content pipeline where COLLADA plays a significant role in empowering the editor for the end-user’s advantage.

[Figure 1: A forward-thinking content pipeline in which COLLADA empowers the editor for the end user’s advantage]

This extends the editor: it is no longer only a content preparation and purposing tool for a specific engine, but can now be used as a general content creation tool for other purposes, such as adding social gaming features. As more and more games have to be created for a variety of platforms, the editor may be the best tool for creating content if there is a business case for exporting the content back out into the COLLADA .dae format so it can be used on other platforms (3D Web, Facebook, Papervision3D, iPhone…) as necessary.

Beyond this, however, one can envision the need for a cloud-based editor used as a content creation tool in its own right: first, to ease development for multiple platforms, and second, as a tool that can be given to end users for purposes such as modding. Hence the evolution of the cloud-based game editor into a content creation tool in and of itself.

Evolution of the 3D Web

Another emerging technical trend, and one that should greatly impact and improve the user experience of social media gaming, is the technology for enabling a native 3D web. There are several ongoing efforts and perspectives in this space. The most prevalent, however, is native web rendering: the ability for a 3D application to be realized in a browser without the need for an application-specific plug-in. The Khronos Group’s WebGL project is an initiative by a collection of popular browser vendors to bring OpenGL|ES 2.0 hardware rendering to browsers through a JavaScript API. WebGL may in time be well suited to 3D casual games on the web, if solving the rendering performance issue is the main concern. One can certainly imagine a WebGL-based website for showing off models, or a WebGL viewer of user content, but as soon as you have to write a plug-in (which is the case for games as soon as there is a performance need for audio, physics, AI, etc., and not just graphics), and you have to make that plug-in work on several platforms, you may not want to rely solely on WebGL. Ditto for building and selling non-open-source applications. See the popularity of the Unity3D and ShiVa3D engines, for instance. The usefulness of such products is not going away any day soon.

Also, it’s not likely that WebGL will be used in conjunction with plug-in based solutions. If WebGL works for a casual game, then it should be used; otherwise, build out a complete installation or a fully functioning plug-in.

That said, a complementary approach to bringing content client-side when you need it (without the need for specialized plug-ins) is something like a COLLADA database deployed in a software-as-a-service (SaaS) model/API. The basic model is a standard XPath query mechanism returning a subset (or aggregation) of COLLADA documents as XML. From that basic model we can create a web services API, where one of the API’s features can be the creation of a WebSocket stream between the application and the server, if the browser, server and application support it. There is already a definition of such an API in the GIS space (http://en.wikipedia.org/wiki/Web_Feature_Service), but it is limited to GIS-type queries. Web services may also revolve around a REST-like protocol. REST fits well with an XPath/XSLT kind of solution, which COLLADA, as an XML format, supports seamlessly.
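Below is a minimal sketch of that basic query model in JavaScript, using the browser’s own XPath support (DOMParser and document.evaluate) to stand in for a server-side query engine. The daeText variable and the WebSocket URL are assumptions, and because COLLADA documents use a default XML namespace, the example query sidesteps it with local-name().

```javascript
// Run an XPath expression against a parsed COLLADA document and collect
// the matching nodes (the subset a query service would serialize and return).
function queryCollada(daeXmlText, xpathExpr) {
  var doc = new DOMParser().parseFromString(daeXmlText, "application/xml");
  var result = doc.evaluate(xpathExpr, doc, null,
                            XPathResult.ORDERED_NODE_SNAPSHOT_TYPE, null);
  var nodes = [];
  for (var i = 0; i < result.snapshotLength; i++) {
    nodes.push(result.snapshotItem(i));
  }
  return nodes;
}

// Example: fetch only the geometry elements (daeText is assumed loaded).
// COLLADA's default namespace would normally need a resolver, hence local-name().
var geometries = queryCollada(daeText, "//*[local-name()='geometry']");

// A long-lived WebSocket (where supported) could then stream further
// fragments as the scene evolves; the URL is purely illustrative.
var stream = new WebSocket("wss://content.example.com/collada/stream");
stream.onmessage = function (event) {
  // Each message carries another COLLADA fragment to merge client-side.
  console.log("Fragment received:", event.data.length, "bytes");
};
```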

The content pipeline in the cloud is indeed where this is all going. The server can create an optimized model from the raw data in its database, drawing on information provided by the web application. The exact same application, running on different hardware, sends the relevant client information (such as screen resolution, memory size, bandwidth, version and vendor of the browser, client credentials…) to the server in its request, and the server processes the data on the fly to deliver the right data to the thin client at the right time. See Figure 2 for such an example.
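A hedged sketch of the kind of request a thin client might send follows; the endpoint and every field name are invented for illustration, and the bandwidth figure would in practice be measured or estimated by the client.

```javascript
// Client profile sent with a scene request so the server can tailor the data
// it returns. Field names and endpoint are illustrative assumptions.
var clientProfile = {
  screen: { width: window.screen.width, height: window.screen.height },
  bandwidthKbps: 1500,                  // measured or estimated by the client
  browser: navigator.userAgent,         // vendor and version of the browser
  credentials: "session-token-here",    // placeholder, not a real token
  wantsWebGL: !!window.WebGLRenderingContext
};

fetch("https://content.example.com/scene/farm?view=front_yard", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(clientProfile)
}).then(function (response) {
  // The server decides on the fly which mesh resolutions, texture sizes
  // and animation data this particular device should receive.
  return response.text();
});
```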

The devil, of course, will be in the details, and identifying all the likely use cases will not be straightforward, though it is not impossible. One can envisage some kind of discovery mechanism whereby, as the client connects to the server, there is an exchange of capabilities to determine what should actually be delivered to the client. This negotiation of capabilities should be designed to be as forward compatible as possible; in other words, if the client is not capable of the latest and greatest features, it should be able to gracefully fall back to displaying content in an older format. In fact, the same content could be made available to different users by different means depending upon their needs and expectations: WebGL in some cases, plug-ins in another, a standalone app in yet another scenario.
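The sketch below illustrates one way such a capability exchange and fallback might look from the client side; the message shapes, endpoint and mode names are assumptions rather than an existing protocol.

```javascript
// Capability negotiation sketch: the client reports what it can do, the
// server answers with the richest delivery mode both sides understand,
// and an older client simply ends up further down its own fallback list.
var supported = [];
if (window.WebGLRenderingContext) supported.push("webgl");
if (navigator.plugins && navigator.plugins.length > 0) supported.push("plugin");
supported.push("static-images");        // last-resort fallback everyone has

fetch("https://content.example.com/negotiate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ supports: supported, formatVersion: "1.4.1" })
}).then(function (response) {
  return response.json();
}).then(function (answer) {
  startDelivery(answer.mode || "static-images");
});

function startDelivery(mode) {
  console.log("Delivering content via:", mode);
}
```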

A developer should run some experiments to assess where the split between server-side and client-side responsibilities should go. For example, in the case of a shared environment (multiple users viewing the same scene), the physics of the scene applies to all users, so collision detection and physical simulation should be server-side. This would also mean that low-end devices (phones, netbooks, etc.) could act as user interfaces to much more sophisticated worlds than their own CPUs could support.

One way of looking at it is that there are things that are shared experiences and things that depend upon the user’s viewpoint. For example, AI and physics are shared experiences in a given world or situation, but culling and audio depend upon the user’s viewpoint. The other factor is how to minimize the amount of data downloaded to the client, and this aspect requires that the server be “aware” of the client’s viewpoint. Minimization doesn’t mean that the server always sends a compressed copy of an entire COLLADA file, or sends over an optimized binary file; instead, it reacts to a call from the client asking for a subset of the actual COLLADA data, using the COLLADA language to communicate only the portion of data needed at that time. Perhaps a full scene-graph could run on the server, with a “shadow” version running on the client. The client scene-graph would be dynamically modified as required, combining manipulations of the server version (due to state changes or user interaction) with changes resulting from the client viewpoint being manipulated by the local user.
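As a rough client-side sketch of that shadow scene-graph idea, the snippet below keeps a lightweight node table that is patched as delta messages arrive over a WebSocket, and reports the local viewpoint back to the server; the message fields and URL are invented for illustration.

```javascript
// Shadow scene-graph sketch: the authoritative graph lives on the server;
// the client keeps a lightweight copy patched by delta messages.
var shadowGraph = {};                    // nodeId -> { transform, visible }

var updates = new WebSocket("wss://content.example.com/scene/updates");
updates.onmessage = function (event) {
  var delta = JSON.parse(event.data);    // e.g. { nodeId, transform, visible }
  var node = shadowGraph[delta.nodeId] || {};
  if (delta.transform !== undefined) node.transform = delta.transform;
  if (delta.visible !== undefined) node.visible = delta.visible;
  shadowGraph[delta.nodeId] = node;
  // A render loop elsewhere reads shadowGraph each frame; only nodes the
  // server considers relevant to this client's viewpoint are ever sent.
};

// Local user interaction (camera moves, clicks) is reported back so the
// server can re-cull and stream the next relevant subset.
function reportViewpoint(position, direction) {
  updates.send(JSON.stringify({ viewpoint: { position: position, direction: direction } }));
}
```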

Closing Remarks

This article merely touches on the ideas and work that would be needed to bring a truly useful and open content pipeline standard to social media game developers. There is a vast sea of topics not addressed here, from Internet protocols to cloud communication standards, protocol and data dependencies, and the need for content security and DRM techniques in the cloud. Likewise, a 3D content database needs to be structured to provide efficiency and value to developers and content creators.

Of course, the end result should be a social game that is fun and brings together a community of devoted players. Social games will evolve, like all technology, and the quality of graphics and the richness of game play will improve with them.
