OGC Future Technologies
Several issues arise around the discovery and use of new technologies.
A widely shared feeling among OGC members is that the OGC should develop a common position on the use of RESTful design elements in its standards. Several efforts have been started, but a common position has not yet emerged. There is nonetheless a wide need for OGC members to better understand the use of HTTP for web services, including issues such as the correct design of web services to ensure optimal caching of persistent resources.
Several OGC web services now have new interfaces that integrate ideas from REST for certain exchanges. The membership is divided over the relation of that work to the RESTful design principles.
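As a sketch of what "optimal caching of persistent resources" could mean in practice, the following minimal example shows one common HTTP pattern: a validator (ETag) plus a Cache-Control lifetime, so that clients can revalidate a persistent resource cheaply with 304 Not Modified. The resource content and ETag scheme are invented for illustration and are not taken from any OGC specification.

```python
# Illustrative sketch (not from an OGC standard): conditional GET handling
# for a persistent resource, using an ETag derived from the content.
import hashlib

def handle_get(resource_body, if_none_match):
    """Return (status, headers, body) for a GET on a persistent resource."""
    etag = '"%s"' % hashlib.sha256(resource_body).hexdigest()[:16]
    headers = {"ETag": etag, "Cache-Control": "public, max-age=86400"}
    if if_none_match == etag:
        return 304, headers, b""           # client's cached copy is still valid
    return 200, headers, resource_body     # full representation

# First request: full response plus cache validators.
status, hdrs, body = handle_get(b"<Coverage>...</Coverage>", None)
# Revalidation: client echoes the ETag, server answers 304 Not Modified.
status2, _, body2 = handle_get(b"<Coverage>...</Coverage>", hdrs["ETag"])
```

The point of the sketch is that a resource-oriented design makes such caching behavior fall out of standard HTTP machinery, rather than requiring service-specific logic.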
A suggestion has been made that the OGC explicitly fund, as part of OGC Testbeds, several alternative efforts at developing simple RESTful services, in order to see which designs work best in practice.
First, this should teach a lesson to the RESTful SWG: it was a good move to establish it, but if you don't come to conclusions for a couple of years, reality will bypass you. Second, a coordinated effort on RESTful testing, using the OWS vehicle or similar, sounds like a great idea. Third, in the WCS REST spec draft we have tried to marry REST with KVP, leading to interesting new possibilities with respect to security mechanisms: https://portal.opengeospatial.org/files/?artifact_id=51832 .
- PRO - The explicit commitment would ensure that some progress was being made.
- PRO - This is exactly the kind of thing that a testbed should be exploring.
- CON - The direct use of OGC resources is an unusual step for the OGC to take, especially on unproven, new designs.
- CON - The lack of consensus despite serious effort suggests that this goal is too vague to work on its own.
- 30 Jun 2013
We are stuck in low-level issues. The world, in particular with the advent of Big Data, has moved on. APIs are dead; what we need next is high-level query languages, based on the eminently successful example of SQL. Such query languages (we have good first steps, by the way: FE, WCPS) (i) enable good machine-machine communication, as there is a clear syntax and (transcending WPS) semantics; (ii) can be wrapped in visual interfaces for human users; (iii) allow versatile server-side optimization and parallelization. Let's leave REST, JSON, etc. to the code monkeys and think about users now.
- 30 Jun 2013
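To make the query-language direction mentioned above concrete, a WCPS request might look roughly like the following; the coverage name and axis labels are invented for illustration:

```
for c in (AvgLandTemp)
return encode(
    c[Lat(40:45), Long(-10:-5), ansi("2014-07")],
    "image/png")
```

The appeal is that such a query expresses *what* result is wanted, leaving the server free to optimize and parallelize *how* it is computed.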
The JSON format is extremely popular on the web. JSON, however, is just a data structure. The interoperable use of JSON requires the full definition of the allowed data structures, which in turn requires a schema for those data structures. Traditionally, this has been done at the OGC using XML. The use of JSON may therefore duplicate much of the work done on XML, leading either to a situation where implementations are required to support both XML and JSON, increasing implementation complexity, or to a situation where implementations fragment between those supporting XML and those supporting JSON, reducing interoperability.
A suggestion has been made that the OGC should develop a clear vision of the role of JSON in OGC standards: either JSON is used in the same way as XML, with the concomitant issues of implementation complexity or fragmentation, or it is restricted in some way to only simple data structures.
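To illustrate the schema point above: the structural checking that an XML schema provides must be reproduced by hand (or via a JSON Schema library) for JSON payloads. The bounding-box exchange structure below is an invented example, not an OGC schema.

```python
# Illustrative sketch: hand-rolled structural validation of a JSON document.
# The {"lower": [x, y], "upper": [x, y]} structure is invented for this example.
import json

def validate_bbox(doc_text):
    """Check that a JSON document is a {lower: [x, y], upper: [x, y]} bbox."""
    doc = json.loads(doc_text)
    return (isinstance(doc, dict)
            and set(doc) == {"lower", "upper"}
            and all(isinstance(doc[k], list) and len(doc[k]) == 2
                    and all(isinstance(v, (int, float)) for v in doc[k])
                    for k in ("lower", "upper")))

valid = validate_bbox('{"lower": [-10.0, 40.0], "upper": [-5.0, 45.0]}')
invalid = validate_bbox('{"lower": [-10.0], "upper": [-5.0, 45.0]}')
```

Every such rule would otherwise live in an agreed schema document; without one, each implementation encodes its own assumptions, which is exactly the interoperability risk described above.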
JSON is often touted as the next hype. To me it is just another protocol binding for W*Ss, next to KVP, POX, SOAP, and REST. An internal client/server interface, not more, not less. If we modularize (!) specs accordingly this will not pose substantial complexity. WCS is demonstrating this.
- PRO - This would clarify the role of JSON as a complement or as an alternative to text formats like KVP or to fully namespaced, schematized formats like XML.
- CON - It might not be possible to reach any consensus on this issue.
- CON - A policy limitation of JSON to simple data structures would artificially limit what is a technological or market issue.
- CON - The OGC has used UML as the abstract data structure and then implemented the UML in XML; this could be done equally well with JSON.
- 30 Jun 2013
The GeoJSON format is quite popular on the web, and the OGC may want to adopt it. However, while the format could be quite useful for OGC services, the OGC cannot adopt GeoJSON without modification, because GeoJSON has an ambiguous axis order policy which conflicts with OGC rules regarding axis order. That ambiguity makes the format unsuitable for use as a general-purpose spatial data format. In order to use GeoJSON, the OGC would have to develop a profile which would
- require GeoJSON processors to check the CRS and adopt processing based on the manifold of the coordinate system, and
- restrict the allowed CRSs to those with axes which can be unambiguously mapped to 'east' and 'north'.
The best policy for the GeoJSON profile would probably be to limit its use to the two CRSs used the vast majority of the time: CRS:84 and Google Spherical Mercator.
Additionally or alternatively, the OGC could develop a GeoJSON-like format for general use that followed the same rules as GeoJSON but simply changed the axis order policy so that the order of ordinates in coordinate tuples follows the order of the axes in the CRS. When used with the two CRSs named above, this format would be identical to GeoJSON, but it could additionally be used with all other CRSs.
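The proposed rule, that ordinate order follows the axis order of the CRS, can be sketched as follows. The tiny axis-order table is illustrative only; a real implementation would consult a CRS registry.

```python
# Illustrative sketch of the "ordinates follow CRS axis order" rule.
# The axis-order table is a hand-picked assumption, not a CRS registry.
AXIS_ORDER = {
    "urn:ogc:def:crs:OGC:1.3:CRS84": ("east", "north"),   # lon/lat order
    "urn:ogc:def:crs:EPSG::4326":    ("north", "east"),   # lat/lon order
}

def to_lon_lat(crs_uri, coords):
    """Normalize a 2D coordinate tuple to (lon, lat) using the CRS axis order."""
    if AXIS_ORDER[crs_uri] == ("east", "north"):
        return tuple(coords)
    return (coords[1], coords[0])  # swap lat/lon into lon/lat

# A latitude-first EPSG:4326 tuple becomes an unambiguous (lon, lat) pair.
lonlat = to_lon_lat("urn:ogc:def:crs:EPSG::4326", (52.5, 13.4))
```

A processor following this rule needs only the CRS identifier to interpret coordinates correctly, which is the clarity the current GeoJSON policy lacks.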
It seems the best way forward for the community at large is: team up with the GeoJSON developers, explain the obvious shortcomings, and collaborate on improvements. Coupling axis order to the CRS makes a lot of sense and is coherent with the OGC Name Type Definition BP; being clearly defined, it is easy to explain and to implement. OGC members might implement JS snippets performing CRS inspection (OWS?). All other alternatives (competing spec, restrictions to 2D, ...) IMHO will fail in the long run for diverse reasons.
- 30 Jun 2013
Several suggestions have been made that the OGC should use external, online resources for its work.
A suggestion has been made that the OGC should develop its standards on GitHub.
I do not believe in this - on the contrary, this confuses, diverges, and ultimately kills the standardization idea. By definition a standard must favor a specific approach over all others. Does software become interoperable by putting it on GitHub? ;-)
- PRO - The standard could be changed and modified ('forked') by anyone for their own purpose allowing for flexible changes.
- PRO - The standard could be developed online.
- CON - The whole purpose of having a standard is having a canonical document which changes only in carefully defined ways.
- CON - The distribution of OGC standards across many locations would cause confusion.
- CON - The reliance on external services adds the risk that OGC critical resources could be lost at any time due to changes in policy of that resource.
- CON - Multiple incompatible derivatives of the standard would cause interoperability issues.
- CON - Standards could be developed in multiple, incompatible, non-interoperable formats.
- CON - The text of the standard can already be taken and modified by anyone.
- CON - The suggestion does not explain the ultimate purpose of this proposal, such as using a distributed versioning system, using the Markdown or HTML format, or some other reason.
- MIX - Developing internal SWG documents on public resources violates OGC TC rules on confidentiality.
- 30 Jun 2013
I view this as a good outreach effort to the developer community. While I respect the above feedback on confidentiality, there are a couple of places where putting work on GitHub would make sense:
- Completed standards: putting the result on GitHub would allow vendors to fork the original in order to document their vendor extensions to a standard.
- Ad-hoc standards: collaborating on borderline cases where a full standard may not be required (such as GeoJSON).
- Engineering Reports: these are aimed a bit more at the developer target audience, containing cold, hard implementation advice. Putting that level of content on GitHub strikes me as the perfect answer to Chris Holmes' rant on GeoPackage.
In all cases, by using GitHub the OGC could track the ways the standard gets forked and view that as feature requests for the next iteration. There is no requirement to accept pull requests, after all, if you cannot sort out the IP consequences of contributions.
Aside: it can be done on the IP side of things. The Eclipse Foundation (which operates in a very locked-down, extremely IP-conscious manner) has recently sorted out a set of procedures to allow projects to function on GitHub while still respecting IP diligence: http://mmilinkov.wordpress.com/2013/06/20/embracing-social-coding-at-eclipse/
- 01 Jul 2013