Semantic Sensor Web
Sensors are distributed across the globe, leading to an avalanche of data about our environment. The rapid development and deployment of sensor technology involves many different types of sensors, both remote and in situ, with diverse capabilities such as range, modality, and maneuverability. It is possible today to utilize networks of sensors to detect and identify a multitude of observations, from simple phenomena to complex events and situations. The lack of integration and communication between these networks, however, often isolates important data streams and intensifies the existing problem of too much data and not enough knowledge. To address this problem, the Semantic Sensor Web (SSW) proposes that sensor data be annotated with semantic metadata that both increase interoperability and provide contextual information essential for situational knowledge. In particular, the SSW presents an approach for annotating sensor data with spatial, temporal, and thematic semantic metadata. This approach leverages current standardization efforts of the Open Geospatial Consortium (OGC) [1] and the Semantic Web Activity of the World Wide Web Consortium (W3C) [2] to provide enhanced descriptions and meaning to sensor data.
Semantics of Sensors within Space, Time, and Theme
The encodings of observed phenomena by sensors are by nature opaque (often in binary or proprietary formats); metadata therefore play an essential role in managing sensor data. A semantically rich sensor network would provide spatial, temporal, and thematic (STT) information essential for the discovery and analysis of sensor data. Spatial metadata provide information regarding the location of sensors and sensor data, in terms of a global reference system (e.g., GPS coordinates), a local reference, or a named location. A local reference is especially useful when a sensor is attached to a moving object such as a car or airplane: while the sensor's absolute location is constantly changing, its location relative to the moving object remains fixed. In addition, data from remote sensors, such as video and images from cameras and satellites, require complex spatial models to represent the field of view being monitored, which is distinct from the location of the sensor. Temporal metadata provide information regarding the time instant or interval at which the sensor data are captured. Thematic metadata describe a real-world state derived from sensor observations, such as objects or events. Every discipline contains unique domain-specific information, such as concepts describing weather phenomena, structural integrity values of buildings, and biomedical events representing patient health status. Thematic metadata can be created or derived by several means, such as sensor data analysis, extraction from textual descriptions, or social tagging.
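As an illustration of how spatial, temporal, and thematic metadata might be attached to a single observation, the following sketch uses the Python rdflib library to assert a location (using the W3C Basic Geo vocabulary), a sampling time, and a thematic concept for a hypothetical observation. The http://example.org/ namespace and names such as obs1, samplingTime, and BlizzardCondition are illustrative assumptions, not part of any SWE or SSW specification.

# Illustrative sketch: attaching spatial, temporal, and thematic RDF metadata
# to a hypothetical sensor observation with rdflib (namespaces are examples only).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/ssw/")                       # hypothetical namespace
GEO = Namespace("http://www.w3.org/2003/01/geo/wgs84_pos#")     # W3C Basic Geo vocabulary

g = Graph()
g.bind("ex", EX)
g.bind("geo", GEO)

obs = EX.obs1                                                   # hypothetical observation
g.add((obs, RDF.type, EX.Observation))                          # thematic: kind of observation
g.add((obs, EX.observedPhenomenon, EX.BlizzardCondition))       # thematic: domain concept
g.add((obs, GEO.lat, Literal("39.76", datatype=XSD.decimal)))   # spatial: latitude
g.add((obs, GEO.long, Literal("-84.05", datatype=XSD.decimal))) # spatial: longitude
g.add((obs, EX.samplingTime,
       Literal("2008-03-08T05:00:00", datatype=XSD.dateTime)))  # temporal: time instant

print(g.serialize(format="turtle"))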
Whereas the languages provided by the OGC Sensor Web Enablement [3] provide annotations for simple spatial and temporal concepts such as a spatial coordinate or a time stamp, more abstract concepts, such as a spatial region, a temporal interval, or any domain-specific thematic entity, benefit from the expressiveness of an ontological representation. Consider, for example, the semantics of a query about weather information at a particular time and place. The type of weather condition being sought could be a simple phenomenon such as a single temperature reading or a complex phenomenon such as a tsunami. The location within the query could be a single coordinate, a spatial region within a bounding box, or a named location such as a park or school. The semantics of the time interval specified by the query could concern weather conditions that fall within the time interval, contain the time interval, or overlap the time interval. The metadata necessary to answer such queries require knowledge of the situation being observed by the sensors. Such knowledge can be represented in ontologies and used to annotate and reason over sensor data to answer complex queries like these.
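The three interval relations mentioned above can be made concrete with a small sketch, written here in plain Python rather than in an ontology or query language: given an observation interval and a query interval, it tests whether the observation falls within, contains, or overlaps the query interval. The dates are illustrative assumptions.

# Illustrative sketch of the interval semantics (within / contains / overlaps)
# that a temporally aware query might apply to sensor observations.
from datetime import datetime

def within(obs_start, obs_end, q_start, q_end):
    # Observation interval lies entirely inside the query interval.
    return q_start <= obs_start and obs_end <= q_end

def contains(obs_start, obs_end, q_start, q_end):
    # Observation interval entirely contains the query interval.
    return obs_start <= q_start and q_end <= obs_end

def overlaps(obs_start, obs_end, q_start, q_end):
    # Observation and query intervals share at least one instant.
    return obs_start <= q_end and q_start <= obs_end

query = (datetime(2008, 3, 8, 0, 0), datetime(2008, 3, 9, 0, 0))
observation = (datetime(2008, 3, 8, 5, 0), datetime(2008, 3, 8, 6, 0))

print(within(*observation, *query))    # True: observation falls inside the query interval
print(contains(*observation, *query))  # False
print(overlaps(*observation, *query))  # True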
Defining the Semantic Sensor Web
The Semantic Sensor Web (SSW) is a framework for providing enhanced meaning for sensor observations in order to enable situation awareness. This is accomplished by adding semantic annotations to the existing standard sensor languages of the Sensor Web Enablement (SWE). The semantic annotations provide more meaningful descriptions of, and enhanced access to, sensor data than the SWE alone, and they act as a linking mechanism to bridge the gap between the primarily syntactic XML-based metadata standards of the SWE and the RDF/OWL-based metadata standards of the Semantic Web. In association with semantic annotation, ontologies and rules play an important role in the SSW for interoperability, analysis, and reasoning over heterogeneous multimodal sensor data.
Semantic Annotation
Many languages may be used for the semantic annotation of sensor data, such as RDFa, XLink, and SAWSDL [4]. This section describes the use of RDFa. RDFa [5], a W3C proposed standard, is a markup language that enables the layering of RDF (Resource Description Framework) information on any XHTML or XML document and is currently used for embedding semantic annotations. RDFa provides a set of attributes that can represent semantic metadata within an XML language, from which RDF triples may be extracted using a simple mapping. The core subset of RDFa attributes includes:
- about – A URI specifying the resource the metadata is about. Extracted as the subject of an RDF triple.
- rel and rev – Specifies a relationship or reverse-relationship with another resource. Extracted as the object property (predicate) of an RDF triple.
- href, src and resource – A URI specifying the partner resource. Extracted as the object of an RDF triple.
- property – Specifies a property for the content of an element. Extracted as the data-type property (predicate) of an RDF triple.
- instanceof – Optional attribute that specifies the RDF type of the subject (the resource that the metadata is about). Extracted as the object property “rdf:type” coupled with the object of an RDF triple.
The following example shows a timestamp encoded in Observations & Measurements and semantically annotated with RDFa. The semantic annotation of the timestamp describes an instance of time:Instant (time here is the namespace for an OWL-Time ontology).
<swe:component rdfa:about="time_1" rdfa:instanceof="time:Instant">
<swe:Time rdfa:property="xs:date-time">2008-03-08T05:00:00</swe:Time>
</swe:component>
The above example generates two RDF triples:
1. time_1 rdf:type time:Instant
2. time_1 xs:date-time “2008-03-08T05:00:00”
The first RDF triple describes time_1 as an instance of time:Instant (subject is time_1, predicate is rdf:type, object is time:Instant). The second describes a data-type property of time_1 specifying the time as a literal value (subject is time_1, predicate is xs:date-time, object is “2008-03-08T05:00:00”). This example illustrates the simple mechanics of embedding semantics in an XML document using RDFa. The semantic annotation of SWE languages enables software applications to “understand” and reason over sensor data consistently, coherently, and accurately.
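The same two triples can also be produced programmatically. The following sketch, an illustrative assumption rather than part of the SWE or RDFa specifications, builds them with the Python rdflib library, using the OWL-Time namespace for time:Instant, an assumed example base URI for the time_1 resource, and the xs:date-time predicate exactly as it appears in the annotation above.

# Illustrative sketch: constructing the two RDF triples extracted from the
# RDFa-annotated timestamp above, using rdflib (the example base URI is assumed).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/observation/")   # assumed base URI for time_1
TIME = Namespace("http://www.w3.org/2006/time#")    # OWL-Time namespace

g = Graph()
g.bind("time", TIME)

# Triple 1: time_1 rdf:type time:Instant
g.add((EX.time_1, RDF.type, TIME.Instant))

# Triple 2: time_1 xs:date-time "2008-03-08T05:00:00"
# (the xs:date-time predicate mirrors the annotation in the example above)
g.add((EX.time_1, XSD["date-time"], Literal("2008-03-08T05:00:00")))

print(g.serialize(format="turtle"))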
Ontologies
An ontology is a formal representation of a domain, composed of concepts and named relationships. The SSW is envisioned to be adopted by a diverse set of domains and would therefore benefit from a collection of ontologies modeling each domain. At a broad level, these ontologies can be classified along the three types of semantics associated with sensor data – spatial, temporal, and thematic – in addition to ontological model(s) representing the sensors domain. At present, there are several ongoing initiatives to build relevant ontologies within various communities, such as the National Institute of Standards and Technology (NIST) [6], the World Wide Web Consortium (W3C), and the Open Geospatial Consortium (OGC). NIST has initiated a project titled Sensor Standards Harmonization in order to develop a common sensor ontology based on existing standards within the sensors domain, including IEEE 1451, ANSI N42.42, the CBRN Data Model, and the OGC Sensor Web Enablement languages. There are also several efforts to design an expressive geospatial ontology, including the W3C Geospatial Incubator Group (GeoXG) [7] and the Geography Markup Language (GML) Ontology [8] of the OGC. OWL-Time [9], a W3C-recommended ontology based on temporal calculus, provides descriptions of temporal concepts such as instant and interval, which support defining interval queries such as within, contains, and overlaps. Domain-specific ontologies that model various sensor-related fields such as weather and oceanography [10] are also needed to provide semantic descriptions of thematic entities.
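As a small illustration of what a fragment of such a domain-specific ontology might look like, the following sketch declares two weather-related classes and one named relationship with rdflib. All terms under http://example.org/ are assumptions made for illustration and are not drawn from any of the ontologies listed above.

# Illustrative sketch of a tiny, hypothetical weather ontology fragment
# (classes and one named relationship) expressed with rdflib.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/weather#")   # hypothetical domain namespace

g = Graph()
g.bind("ex", EX)

# Concepts (classes) of the domain.
g.add((EX.WeatherObservation, RDF.type, OWL.Class))
g.add((EX.WeatherPhenomenon, RDF.type, OWL.Class))
g.add((EX.Blizzard, RDF.type, OWL.Class))
g.add((EX.Blizzard, RDFS.subClassOf, EX.WeatherPhenomenon))

# A named relationship between observations and phenomena.
g.add((EX.observedPhenomenon, RDF.type, OWL.ObjectProperty))
g.add((EX.observedPhenomenon, RDFS.domain, EX.WeatherObservation))
g.add((EX.observedPhenomenon, RDFS.range, EX.WeatherPhenomenon))

print(g.serialize(format="turtle"))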
Rule-based Reasoning
To derive additional knowledge from semantically annotated sensor data, it is necessary to define and use rules. Rule languages and rule processing systems are still evolving (e.g., RuleML and ongoing W3C work on the Rule Interchange Format, RIF). To demonstrate the application of rules and rule-based reasoning, SWRL-based rules defined over OWL ontologies can currently be used to deduce new ontological assertions from known instances. Logic rules infer additional knowledge from existing information through reasoning. The Semantic Web Rule Language (SWRL) [11] is a proposal to the W3C for a standard rule language for the Semantic Web. SWRL is based on the Web Ontology Language (OWL) and uses the antecedent → consequent structure to define rules. The primary advantage of SWRL is that it seamlessly incorporates rules into an OWL ontology schema (with extended OWL model-theoretic formal semantics) while providing enhanced application-specific expressivity.
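The antecedent → consequent pattern can be illustrated with a toy rule, implemented here as a simple forward-chaining pass over an rdflib graph in Python rather than with a SWRL engine. The namespace, the hasWindSpeedMph property, and the 74 mph threshold are assumptions made for illustration only.

# Illustrative sketch: a toy rule of the form
#   Observation(?o) ^ hasWindSpeedMph(?o, ?s) ^ greaterThan(?s, 74) -> HurricaneForceWind(?o)
# applied by a simple forward-chaining pass with rdflib (not a SWRL engine).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/weather#")   # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# Known instance data: one annotated wind-speed observation.
g.add((EX.obs42, RDF.type, EX.Observation))
g.add((EX.obs42, EX.hasWindSpeedMph, Literal(81, datatype=XSD.integer)))

# Apply the rule: every Observation whose wind speed exceeds 74 mph
# is additionally asserted to be a HurricaneForceWind observation.
for obs in g.subjects(RDF.type, EX.Observation):
    for speed in g.objects(obs, EX.hasWindSpeedMph):
        if float(speed) > 74:
            g.add((obs, RDF.type, EX.HurricaneForceWind))

print((EX.obs42, RDF.type, EX.HurricaneForceWind) in g)   # True: new assertion inferred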
OGC Sensor Web Enablement
The Sensor Web is a special type of Web-centric information infrastructure for collecting, modeling, storing, retrieving, sharing, manipulating, analyzing, and visualizing information about sensors and sensor observations of phenomena. The OGC, an international consortium of industry, academic, and government organizations that develops open geospatial standards, describes the Sensor Web as Web-accessible sensor networks and archived sensor data that can be discovered and accessed using standard protocols and application program interfaces. The Sensor Web has vast significance for applications that utilize sensor technologies to attain actionable situation awareness. Lack of standardization, however, is the primary barrier to the realization of a progressive Sensor Web. The OGC established the Sensor Web Enablement (SWE) initiative to address this need by developing a suite of specifications related to sensors, sensor data models, and sensor Web services that enable sensors to be accessible and controllable via the Web. The core suite of language and service interface specifications includes the following:
- Observations & Measurements (O&M) – Standard models and XML schema for encoding observations and measurements from a sensor, both archived and real time.
- Sensor Model Language (SensorML) – Standard models and XML schema for describing sensor systems and processes; provides information needed for discovery of sensors, location of sensor observations, processing of low-level sensor observations, and listing of taskable properties.
- Transducer Model Language (TransducerML) – The conceptual model and XML schema for describing transducers and supporting real-time streaming of data to and from sensor systems.
- Sensor Observation Service (SOS) – Standard Web service interface for requesting, filtering, and retrieving observations and sensor system information; this is the intermediary between a client and an observation repository or near-real-time sensor channel (a minimal request sketch appears after this list).
- Sensor Planning Service (SPS) – Standard Web service interface for requesting user-driven acquisitions and observations. This is the intermediary between a client and a sensor collection management environment.
- Sensor Alert Service (SAS) – Standard Web service interface for publishing and subscribing to alerts from sensors.
- Web Notification Services (WNS) – Standard Web service interface for asynchronous delivery of messages or alerts from SAS and SPS Web services and other elements of service workflows.
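The following sketch shows how a client might issue a basic key-value-pair GetCapabilities request to a hypothetical SOS endpoint using Python; the endpoint URL is an assumption, and real deployments and SOS versions differ in the parameters and operations they support.

# Illustrative sketch: requesting the capabilities document from a hypothetical
# Sensor Observation Service endpoint via an OGC-style key-value-pair request.
import requests

SOS_ENDPOINT = "http://example.org/sos"   # hypothetical SOS endpoint

params = {
    "service": "SOS",                     # OGC service type
    "request": "GetCapabilities",         # operation common to OGC web services
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()

# The capabilities document is XML describing offerings, observed properties,
# and supported operations; a client would parse it to discover available observations.
print(response.text[:500])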
See also
- Knoesis Center — A research center focusing on realizing a knowledge society that utilizes semantics and services as key enablers.
- Semantic Sensor Web Project @ Knoesis Center — Research project at Kno.e.sis Center bringing semantics to sensor networks.