To better understand linked open services, it is best to discuss linked open data first. Simply put, linked open data is linked data that is released under open terms. To distinguish linked open data from linked data, Tim Berners-Lee provides a clear differentiation of the terms. According to him, both linked data and linked open data follow the same four principles; linked open data is set apart by a fifth requirement, openness, which is its defining characteristic. Examples of large linked open data sets are DBpedia and Freebase.
The phrase “linked open data” has been in use since February 2007, when the “Linking Open Data” mailing list was created. The mailing list was initially hosted by the SIMILE Project at the Massachusetts Institute of Technology (MIT). From it grew the Linking Open Data community project of the W3C Semantic Web Education and Outreach group, whose objective is to extend the Web with a data commons. This is done by publishing various open data sets as RDF on the Web and by setting RDF links between data items from different data sources. In October 2007, these data sets comprised over two billion RDF triples, interlinked by over two million RDF links. Four years later, in 2011, this had grown to 31 billion RDF triples, interlinked by around 504 million RDF links. A detailed statistical breakdown was published in 2014.
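An RDF link of the kind the project publishes is simply a triple whose subject and object belong to different data sets. A minimal sketch in Turtle, using DBpedia and GeoNames identifiers purely for illustration:

```turtle
@prefix owl:     <http://www.w3.org/2002/07/owl#> .
@prefix dbpedia: <http://dbpedia.org/resource/> .

# An RDF link: one triple connecting an item in DBpedia
# to the corresponding item in GeoNames (illustrative IDs).
dbpedia:Berlin owl:sameAs <http://sws.geonames.org/2950159/> .
```

Because both URIs are dereferenceable over HTTP, a client that encounters either one can follow the link and retrieve further RDF descriptions from the other data set.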
Meanwhile, a number of European Union projects involve linked data, including the Linked Open Data Around-The-Clock (LATC) project, the DaPaaS (Data-and-Platform-as-a-Service) project and the Linked Open Data 2 (LOD2) project. Data linking is one of the primary goals of the EU Open Data Portal, which makes available numerous data sets for anyone to reuse and link. Speaking of data sets, there are several that one should be acquainted with. The most popular so far is DBpedia, which contains data extracted from Wikipedia: an estimated 3.4 million concepts described via 1 billion triples, covering 11 different languages. Also in the Wikipedia family, Wikidata is a collaboratively created linked dataset that acts as central storage for the structured data of its Wikimedia sister projects. Apart from these datasets, FOAF is distinguished by describing persons, their properties and relationships. Lastly, GeoNames provides RDF descriptions of more than 7,500,000 geographical features around the world.
This is, in essence, how linked open data works. Building on it, linked open services are a principled approach to describing, composing and producing services on the Semantic Web. What is favored in this approach is that it emphasizes linked data technologies such as RDF, HTTP and SPARQL rather than imposing heavyweight Web service standards like WSDL and SOAP. Having said that, linked open services rest on a straightforward set of principles that can be discovered, sustained and refined in an open, community-driven way.
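The contrast with WSDL and SOAP can be made concrete: consuming linked data requires nothing beyond an HTTP request carrying a SPARQL query. A sketch of such a query against DBpedia's public endpoint (the specific property and resource names are illustrative, and endpoint behavior may vary):

```sparql
# Sent via a plain HTTP GET to http://dbpedia.org/sparql --
# no SOAP envelope or WSDL contract is involved.
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX dbr: <http://dbpedia.org/resource/>

SELECT ?thing
WHERE {
  ?thing dbo:location dbr:Berlin .   # things located in Berlin
}
LIMIT 10
```

The same query could be issued against any SPARQL-capable data set, which is exactly the kind of uniform, open interface that linked open services build on.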