Knud Böhle, Michael Rader, Arnd Weber, Dirk Weber
If we cast our minds back to the beginning of the new millennium, European policy makers at that time were worried that Europe was lagging seriously behind other regions of the world in terms of Internet access and application and that this would continue to be the case. According to EUROSTAT, in 2007, seven years after the announcement of the Lisbon Agenda, well over half of European households (54%) had Internet access, the majority over broadband (42% of households). In the leading countries, over 80 percent of households had access, well over three quarters of them via broadband. There are predictions that 3 million new jobs will be created by SMEs operating in the sector by 2015.
Due not least to the rapid development and diffusion of new information and communication technologies, new uses of the Internet have become possible, providing new business opportunities for a broad range of people, leading to new patterns of consumption and production of digital goods. While the changes have seldom been sudden, the result is a new quality of the Internet which has been labelled "Web 2.0". This is linked with terms like networked electronic media, user generated content, social networking and many others which can easily confuse the observer watching developments at a distance.
The aim of this report is to help unravel the concepts underlying the development of the Internet and to point out critical aspects which might require the attention of policy makers in the foreseeable future.
In its Introduction, the report clarifies the basic concepts of "networked electronic media", "Web 2.0" and "User Generated Content". Chapter 2 deals with Technological Developments and Technology Visions and provides facts and figures about innovations in hardware, software and networks important for the media industries and Web 2.0 media formats. It considers semantic technologies and the vision of the "semantic web" at some length, and closes by looking at long-term media technology visions. Chapter 3 positions European Media Industries in the global context, analysing the audio-visual sector, the gaming sector and the mobile Internet in more depth. Chapter 4 is devoted to networked electronic media associated with Web 2.0 and User Generated Content. The basics of Web 2.0 media are explained in terms of the users' media experience and in terms of Web 2.0 business models, before UGC platforms are addressed as a specific type of media. The final Chapter 5, "About Exploitation, Remuneration and Copyright Policies in Web 2.0 environments", describes the new media business and outlines policy-relevant insights. The discussion of implications and side effects of these new media focuses on the possible hidden exploitation of "prosumers" – consumers who also produce content –, on potential impacts on the labour market in the media sector, on privacy, and on more general transformations of the media industries due to automated or semi-automatic media production. There is then an extensive discussion of an appropriate (micro)payment infrastructure for the Web 2.0 environment. Last but not least, the issue of Digital Rights Management technologies is raised in the context of copyright policies in Web 2.0 environments.
Web 2.0 is shorthand for recent trends in Web-technologies, a changing networked media landscape with new business models and visible changes in the way people communicate via Internet. Beyond a series of new forms of networked electronic media such as blogs, wikis, social networking sites, video sharing platforms and photo sharing sites, Web 2.0 can be regarded as an environment based on a homogeneous underlying infrastructure. This can bring together the local and the global, the stationary and the mobile, the private and the public, the commercial and the amateur, work and play in countless ways. This Web 2.0 environment provides innovative business opportunities for media companies, telecommunications and IT industries.
The open architecture of the Internet and Internet standards enable large-scale interoperability and globalisation of services and applications. Developments in hardware influence the creative content industries by improving the connectivity and performance of distribution channels. The newly emerging Web 2.0 technologies embrace advances in client-server communication and facilitate the use of services on the Web as extensions to the personal computer. Advances in programming tools, social software, and easy-to-use and inexpensive tools for content creation have enabled new media forms (like blogs and wikis) and new communication and co-operation forms like virtual communities. Thanks to Web 2.0 technologies, users can better control their media consumption: the desktop and the browser have turned into powerful media content control centres, allowing the personalisation of the networked media experience.
One class of new media to emerge from Web 2.0 environments is the so-called User Generated Content (UGC) platform. A typical example is Flickr, a platform for sharing personal photographs. These new media are far from being non-profit. Although UGC is intuitively associated with a certain amount of creative effort on the part of users, the term covers a whole range of input from users, even including the involuntary production of commercially exploited data traces.
User Generated Content in principle caters for a niche market, in which each separate niche covers customers or audiences with specialised demands. UGC platforms serve numerous niches. By providing an infrastructure for the aggregation and presentation of content serving niches, these platforms become a new form of networked media. The role of the user is not only to upload original content, but also to act as broker between supply and demand by tagging, recommending and the like. Content on UGC platforms often stems from users and from the media industries. UGC platforms currently serve as an exchange where amateur content can qualify for commercial exploitation in mass media, and where commercial content is offered for free to regain attention. They can also be understood as "free access markets" created by companies, where neither "buyer" nor "seller" pay for market entry.
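The brokering role of users can be illustrated with a minimal sketch: community-supplied tags match niche demand to niche supply. All item names, tags and the similarity threshold below are invented for illustration; real platforms combine tagging with many richer signals.

```python
# Illustrative sketch: users act as brokers by tagging content, and
# tag overlap matches niche demand to niche supply.
# All items, tags and the threshold are hypothetical examples.

def jaccard(a, b):
    """Overlap between two tag sets (0.0 = disjoint, 1.0 = identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Content uploaded by users, described only by community-attached tags.
catalogue = {
    "clip-001": {"accordion", "folk", "live"},
    "clip-002": {"skateboarding", "tutorial"},
    "clip-003": {"folk", "fiddle", "session"},
}

def recommend(interest_tags, catalogue, threshold=0.2):
    """Rank items whose community tags overlap a user's declared interests."""
    scored = ((jaccard(interest_tags, tags), item)
              for item, tags in catalogue.items())
    return [item for score, item in sorted(scored, reverse=True)
            if score >= threshold]

print(recommend({"folk", "fiddle"}, catalogue))
```

The point of the sketch is that no editor intervenes: the ranking emerges entirely from metadata that other users have volunteered.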
Some markets have been heavily impacted by the rise of user generated content services. The most notable examples are encyclopaedias, the online adult industry and the market for music videos. In other markets UGC complements the existing supply. All major media companies are in the process of setting up UGC services or taking over successful grass roots initiatives, witness the example of the acquisition of the YouTube video platform by Google.
There are clear indicators that the Internet, as an Internet of media, is turning into the growth engine of the media and entertainment industries. Traditional media migrating to the Internet are able to compensate for lower growth rates or losses in the physical world, but born-digital content like video games is showing the highest growth rates. Forecasts see the EU27 members from Southern and Eastern Europe as the most dynamic in the region. An international comparison of media companies shows the importance of US-based global players (Google, MSN, Yahoo), but also the strength of national actors in Europe, often belonging to incumbent media industries.
The audio-visual sector is being faced with upheavals due to digitisation. On-demand viewing is likely to be driven by TV-based platforms, including services delivered over the Internet, rather than by public Internet platforms. Radio broadcasting traditionally has a very strong regional element. Due to uncertainty about future standards, existing broadcasters are currently showing little interest in the transition to digital. The adoption of online radio is slow due to a lack of adequate affordable broadband access in parts of Europe and the slow diffusion of suitable listening devices.
Previously viewed as a slightly disreputable segment of the toy industry, online video games are a rapidly growing segment of the mainstream media and entertainment sector with huge business opportunities. Web-based and mobile online video games are turning the games sector into a distinctive type of networked electronic media. Online video games thus have the potential to become mass media for everyone – not only for the typical young male computer nerds, but also for women and older people. Online games are competing with other mass entertainment media such as TV and movies, and also with device-dependent games (e.g. consoles).
The Internet enables efficient distribution platforms for online games using the typical approaches to profitable business on the Internet, such as subscription models, micropayment/advertising, and indirect revenue streams. The most important segment of online games in economic terms is still that of Massive Multi-Player Online Games (MMOs) and, within this category, Massive Multi-Player Online Role-Playing Games (MMORPGs).
The fastest growing segment within the online game segment is probably casual games with over 200 million people playing online casual games every month – both downloadable and browser games. Most mobile phone games can be regarded as a sub-section of the casual games section.
A future vision for the gaming sector is "pervasive games", which will extend the gaming experience out into the physical world. The gaming experience of tomorrow may use your home city's streets as a playground, with everyday life co-existing side by side with virtual elements.
While there are figures indicating that the importance of data services in Europe is growing, the content industries argue that the pace of change is too slow due to a lack of support for Internet standards on the part of European operators. The process could be accelerated by a shift from expensive SMS to cheap e-mails with links to sites on the Internet. This could be supported by policy measures boosting competition, based on lessons already learned in Japan, which is the world leader in the mobile Internet.
Major proposals in this area include the Europe-wide provision of wireless Internet services allowing for Internet telephony (Voice over IP) without roaming fees, as well as European spectrum regulation beneficial for the content industries. Radio spectrum policy could provide support by means of long-range unlicensed spectrum, the provision of pan-European licenses, the provision of licenses to new competitors, and the enforcement of technology neutrality with regard to radio technologies. Such spectrum policy could in particular focus on re-using "beachfront" TV spectrum (spectrum in the range of 700 megahertz).
A working group composed not only of the incumbent spectrum owners, but also those parties who would benefit from the new approaches, could be set up to elaborate these options as a first step.
Up to now, educational uses of games have included rather large simulation-type games, but also single-player "mini games" of the casual type available online from Java and Flash game portals.
Based on the evidence of time spent by users, online games seem to be overall more attractive and compelling than educational software, which tends to be put aside after a few hours. This could be due to less appealing graphics and to content which fails to sustain interest.
On the one hand, educational software can learn from online games; on the other hand, it might be in the public interest if game developers were willing to integrate educational elements into games primarily intended for entertainment.
The deliberate incorporation of features of "edutainment" in mainstream teaching of all subjects, and not only computer skills, requires dissemination of "best practice" to avoid costly mistakes with applications which do not capture pupils’ interest and engagement or are unsuccessful in achieving their educational goals. The utilization of edutainment features in teaching requires an adequate infrastructure for schools throughout Europe to provide equal access to the benefits of computer-based teaching methods, including online access and sufficiently fast and powerful computers. There might be greater potential for the use of educational software outside compulsory education, such as further education and adult education.
To make optimum use of the potential benefits provided by games with educational elements, it would be useful to develop Europe-wide recommendations on uses and applications for various levels and types of school. As a measure to diffuse and optimise the utilisation of networked electronic media for educational purposes, a start could be made by taking an inventory of such media and by creating platforms for the exchange of experience at the European level.
Much of the past discussion on payment schemes for the Internet has focused on systems for the payment of sums too small for other systems like credit cards. While it is true that there is still a lack of interoperability, of cross-border standards, and of a common infrastructure for dedicated micropayment schemes, the demand for such systems has decreased. Current demand is not sufficient to push micropayment systems any further, so there is no need for policy to intervene.
Even so, existing schemes do not sufficiently support a wide variety of content, payments to small content creators, or person-to-person payments. In the medium term, current interpersonal payment systems may develop towards more cash-like P2P payments. There is a need for policy to monitor these developments, to analyse the low-value payment issue from a societal perspective and to reconsider the regulation of prepaid low-value payment schemes.
Digital Rights Management (DRM) technologies can be understood as computer technologies supporting copyright policies. There is currently a shift from DRM technology as "containment" of content to forensic DRM, meaning technologies for identifying, tracking and tracing content (and possibly the persons associated with it). Digital watermarks and acoustic fingerprints are perhaps the most prominent of these technologies. Content on Web 2.0 is increasingly being sold free of restrictive DRM technology, but providers use forensic technology to control the circulation of copies and curb infringement of copyright, and also as a means of preventing the uploading of unauthorized copies, since access providers are expected to apply this type of forensic DRM technology for "filtering".
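The principle behind such fingerprinting can be sketched in a few lines: derive a compact, noise-tolerant signature from a signal and compare signatures by distance rather than exact equality. The toy below works on raw sample values and is purely didactic; real acoustic fingerprints operate on spectral features, and all data here is invented.

```python
# Toy illustration of forensic fingerprinting: a coarse bit signature
# that survives small distortions, compared by Hamming distance.
# Real systems (e.g. acoustic fingerprints) use spectral features;
# the sample values below are invented for illustration only.

def fingerprint(samples, frame=4):
    """One bit per adjacent frame pair: does the mean level rise or fall?"""
    means = [sum(samples[i:i + frame]) / frame
             for i in range(0, len(samples) - frame + 1, frame)]
    return [1 if b > a else 0 for a, b in zip(means, means[1:])]

def hamming(fp1, fp2):
    """Number of differing bits between two equal-length fingerprints."""
    return sum(b1 != b2 for b1, b2 in zip(fp1, fp2))

original = [10, 12, 11, 13, 40, 42, 41, 43, 5, 6, 5, 7, 30, 31, 29, 32]
noisy    = [s + 1 for s in original]          # slight distortion of a copy
other    = [50, 2, 48, 3, 49, 1, 47, 2, 50, 3, 49, 2, 48, 1, 47, 3]

fp = fingerprint(original)
print(hamming(fp, fingerprint(noisy)))   # small distance: likely a copy
print(hamming(fp, fingerprint(other)))   # large distance: a different work
```

Because the signature encodes coarse structure rather than exact bytes, a slightly distorted copy still matches, which is exactly what makes such techniques useful for filtering – and what makes their error rates a policy concern.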
The ability to track files through forensic DRM obviously raises privacy concerns. Additionally, the technology is not sufficiently reliable to identify all infringements or to distinguish true from merely suspected infringements. Nor does forensic DRM solve the problem that certain uses are perfectly legal under copyright law: human moderation and judgement will remain essential to distinguish between illegal and legal use and to identify exceptions in copyright law. Regulation might be required to ensure the fair use of such technologies.
The time, effort and money necessary to enforce copyright policies for large amounts of content of interest to even the smallest minority have led to proposals to leave content with only a small audience uncontrolled and to define a threshold for when it is worth controlling.
As Web 2.0 facilities and practices enable and encourage everyone connected to the Internet to engage in content production and dissemination and to re-combine and reconfigure existing media content, "transformative uses" are on the rise bringing with them the question of how to handle related copyright issues. Proposals to go for a wider spectrum of legitimate "transformative uses" could be supported in practice by Creative Commons Licences as an open way of protecting copyrights while granting more possibilities for free use, reuse and transformative uses of copyright protected materials.
Many web services today base their business models on revenues from advertising. The service provider furnishes the platform and facilities for its use, the users produce content, content generates traffic, and traffic attracts advertising revenue for the service.
The user's social capital in Web 2.0 environments consists of three value sources: personal profile and contacts, content contributions and data traces. This implies a risk of "triple exploitation". The involvement of prosumers in the value chain of Internet media requires further reflection on adequate compensation, fair revenue sharing, and protection of the users' privacy.
Although the information flowing across Web 2.0 can be used to personalize advertising and content on UGC platforms, and to help search engine providers learn more about user needs with the aim of delivering more relevant and meaningful results, the large-scale monitoring and aggregation of users' online personal and intellectual activities brings with it threats to privacy.
The increasing importance of advertising may also have effects on professional journalism. On the Internet, ads can be fine-tuned to correspond with the content of, for example, an article. Articles which are closer to products and services are more likely to be supported by advertising, while well-made articles about unpleasant realities are less likely to attract advertising and generate less advertising revenue. This link between content and ads may in the long run decrease the demand for critical journalism and diminish its overall quality.
Despite the "disappearance" of space and time due to information and communication technologies, stimulating environments and infrastructures favour the local concentration of activities of the "creative class", recently discussed as an important factor for regional development. Although it is maintained that environments still play a major role, the role of the local infrastructures is diminishing through Web 2.0 tools for production, diffusion and distribution and in the end there might be a decrease in the importance of location as an environment. This is an issue well worth examining when devising policies seeking to boost local development.
The ability of Web 2.0 technologies to generate new media products automatically or semi-automatically is likely to have an impact on the labour market. Software programmes can build on the expanding universe of original content, to which User Generated Content and other freely available content (e.g. public domain material, public sector information) significantly contribute. Such programmes can bundle and personalize content from the net. Search engines and all other machines which harvest and re-purpose content from the Internet are instances of this development. "Secondary media", as media which do not build on proprietary content are known, have no need for authors; what they need are programmers. In principle, personalized radio or TV stations, like peer-to-peer file-sharing networks or UGC platforms, can operate worldwide, offering personalized services with very few employees actually producing, editing and enriching content.
a) Semantic technologies are already here to stay
Semantic technology makes it possible to add meaning to mere data and content. Technologies enabling the realisation of the vision of the "semantic web" have achieved sufficient maturity for regular use in parts of the media industries and in other communities. Important standards and specifications are already in place. There is a broad range of research in this field funded by national governments and the European Commission. Continued effort is needed to keep Europe competitive in the field and in the lead in selected areas. Improvements here can be very useful for media companies.
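The core idea – attaching machine-readable meaning to mere data – can be sketched as subject-predicate-object statements in the style of RDF triples. The namespace, resources and properties below are invented for illustration; real systems use standardized vocabularies, ontologies and dedicated triple stores.

```python
# Minimal sketch of the semantic-web idea: facts as subject-predicate-
# object triples, as in RDF, which machines can query directly.
# The namespace and vocabulary below are hypothetical examples.

EX = "http://example.org/"          # invented namespace

triples = {
    (EX + "clip42", EX + "type",    EX + "Video"),
    (EX + "clip42", EX + "depicts", EX + "Karlsruhe"),
    (EX + "clip42", EX + "creator", EX + "alice"),
    (EX + "photo7", EX + "type",    EX + "Photo"),
    (EX + "photo7", EX + "depicts", EX + "Karlsruhe"),
}

def query(triples, s=None, p=None, o=None):
    """Return all triples matching a pattern; None acts as a wildcard."""
    return {(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)}

# "Which resources depict Karlsruhe?" - answered by structure, not by
# free-text search over the content itself.
hits = query(triples, p=EX + "depicts", o=EX + "Karlsruhe")
print(sorted(ts for ts, _, _ in hits))
```

The gain over keyword search is that the question is answered from explicit, typed relations, which is what makes annotated content valuable to media companies.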
Semantic technologies are finding use in the automatic production of secondary media. The potential for Internet radio stations or movie channels to better customize their services with a minimum of personnel has already been pointed out. The next step would be the autonomous generation of new knowledge from what is already there.
The principle of harvesting and repurposing existing content raises crucial questions of copyright and digital rights management. The knowledge semantic search engines collect about persons, their behaviour and their preferences might turn into a nightmare for privacy if not monitored and regulated appropriately by politics. A further more subtle effect of semantic technologies worth investigation is the delegation of knowledge work to machines (intelligent agents), the trustworthiness of these agents, and a potential loss of citizens' "informational autonomy".
b) What about Google? – Semantic search engines and other powerful visions
Semantic web technologies have already proved useful as a navigation and search interface to databases on certain web sites, but a global semantic search engine is far from reality given the billions of existing documents without semantic annotations. In view of the massive activities of users tagging and rating content on Web 2.0, the "semantic web" vision has been modified and labelled Web 3.0 to imply the convergence of Web 2.0 and semantic technologies.
The semantic web community is of course aware that computerized mechanisms to extract semantic information from text and multimedia documents are required to make further progress towards the semantic web. It may, however, have underestimated the potential of other approaches to improve Internet search by analyzing user behaviour, mining the existing Web and mining the semantic web.
It is likely that there is no single best way to realise the vision; an optimised synthesis of the different strands of search improvements is more likely to form the basis of the next generation of Internet search engines, and it would be surprising to find that Google had missed out.
The most ambitious vision of the semantic web envisages personalized intelligent software agents, which not only answer natural language questions, but also perform tasks for users. The concept of the "Internet of services" sees the Internet as a huge network of applications able to perform tasks based on requests by users. The idea of having an artificial agent that can reliably search for information is still largely a vision rather than a reality, but the Internet of Services is a rather likely long term trend of Internet development.
c) The semantic web will ease in gradually
In view of the state of semantic web technologies and considering the far-reaching visions, the "semantic web" cannot be conceptualized in isolation from other trends and is not as disruptive a technology as some protagonists claimed some years ago. The gradual enrichment of the World Wide Web with semantics now appears to be an evolutionary process linked with other developments. We have seen that bridges between the "syntactic web" (HTML, XML etc.) and the semantic web are required and that there is no clear break where information about structure ends and semantics begin. User involvement is crucial for the development of the semantic web, because without it the billions of documents will never be semantically annotated, and human intervention is needed as a corrective to the automated processing of meaning. Last but not least, further improvements to search engines will not depend on semantic descriptions based on ontologies alone. There are many more approaches to improving Web searches, based on web mining, "semantic web mining", "observational metadata" or "similarity detection". Improved search engines are likely to unite the best features of all approaches.
As others, like the OECD, have pointed out before, there is a lack of sound statistics and reliable surveys about the new sector of networked electronic media. Available data are often restricted to the United States. There is no such thing as a European Networked Media Observatory. Little is known about how media consumption and behaviour are changing, and there is still no economic measurement of the "networked electronic media sector", in which many industries and many actors jointly generate economic value. On the one hand, as technology companies (telcos, ISPs, Internet companies) tend to move up the value chain towards content, the convergence of providers needs to be taken into account. On the other hand, the broader creative content sector, consisting of amateur, semi-professional and professional producers, deserves more attention in media statistics and research.
The present report "Looking Forward in the ICT & Media Industries" is the final deliverable of the STOA project "Looking forward in the ICT and media industry – technological and market developments". It is based on research by the project team at ITAS (Institut für Technikfolgenabschätzung und Systemanalyse, Karlsruhe), a member of ETAG, the European Technology Assessment Group. In addition to desk research, the report draws on communications with experts, who commented on an earlier version, and on insights from the workshop "ICT & Media Industries in the Times of Web 2.0", which took place in the European Parliament on June 26, 2008.
The thematic focus of the report is on "networked electronic media" and particularly on Web 2.0 and User Generated Content (UGC) developments. The report combines descriptive stock-taking efforts with more in depth analysis of selected policy relevant issues and an assessment of some future oriented visions of ICT and media development.