
ViRGiS

Virtual Reality GIS Platform

ViRGIS Version 1

Scripting Reference

Table of contents
  1. Project Schema
  2. Entity Data Schema & Object Schema
  3. Data Ingestion
  4. Georeference Framework
  5. 3D Geometry Tools
  6. Event System
  7. User Interface
    1. Guidelines
    2. Design Decisions
  8. Edit Session
  9. Notes

Project Schema

The ViRGIS project schema is a custom JSON schema developed from components of several existing standards and pseudo-standards.

The full definition is shown in the Scripting Reference.
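
As an illustration only, a project file of this kind could be deserialised with JSON.NET as sketched below. The class and property names are hypothetical; the actual schema is the one defined in the Scripting Reference.

```csharp
using Newtonsoft.Json;

// Hypothetical project class for illustration only; the real schema is
// defined in the Scripting Reference.
public class VirgisProject
{
    [JsonProperty("name")]
    public string Name { get; set; }

    [JsonProperty("layers")]
    public object[] Layers { get; set; }
}

public static class ProjectLoader
{
    // Deserialise the project JSON with JSON.NET.
    public static VirgisProject Load(string json) =>
        JsonConvert.DeserializeObject<VirgisProject>(json);
}
```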

Entity Data Schema & Object Schema

ViRGIS App v1 will support the following types:

| Layer Type | Feature Types | Formats | Feature Contents |
|------------|---------------|---------|-------------------|
| Point | Point, Multipoint | GeoJSON | Datapoint GO |
| Line | Linestring, MultiLinestring | GeoJSON | Vertex GOs and Line Segment GOs |
| Polygon | Polygon, MultiPolygon | GeoJSON | Vertex GOs, Line Segment GOs and a Mesh body |
| Mesh | Mesh | OBJ, OFF, STL, .3DS | Mesh |
| PointCloud | Point Cloud | PLY | Particle System |
| Map | Raster Tile | Mapbox | Tilesystem |
| Terrain | Raster Tile | Mapbox | Mesh |
| Map | Vector Tile | Mapbox | Tilesystem |
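
To make the "Feature Contents" column concrete, the sketch below shows how a Line feature could be decomposed into Vertex GOs and Line Segment GOs. The class, prefabs and scaling are illustrative assumptions, not the actual ViRGIS implementation.

```csharp
using UnityEngine;

// Hedged sketch: one GameObject per vertex plus one per segment.
public class LineFeatureSketch : MonoBehaviour
{
    public GameObject vertexPrefab;    // e.g. a small sphere ("Vertex GO")
    public GameObject segmentPrefab;   // e.g. a cylinder ("Line Segment GO")

    public void Build(Vector3[] worldVertices)
    {
        for (int i = 0; i < worldVertices.Length; i++)
        {
            Instantiate(vertexPrefab, worldVertices[i], Quaternion.identity, transform);

            if (i > 0)
            {
                // Place a segment between consecutive vertices.
                Vector3 a = worldVertices[i - 1];
                Vector3 b = worldVertices[i];
                GameObject seg = Instantiate(segmentPrefab, (a + b) / 2f,
                    Quaternion.FromToRotation(Vector3.up, b - a), transform);
                // Assumes a default Unity cylinder, which is 2 units tall.
                seg.transform.localScale = new Vector3(0.05f, (b - a).magnitude / 2f, 0.05f);
            }
        }
    }
}
```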

Data Ingestion

In ViRGIS App V1, data ingestion will be performed by a set of dedicated IO drivers. For the JSON-based formats, these are based upon the Newtonsoft JSON.NET library [1], using the GeoJSON.NET [2] type definitions to address GeoJSON objects. For the mesh-based formats, the geometry3Sharp library is used to create meshes from the raw data, and for point clouds a custom script reads the PLY file and creates a Particle System.

ToDo add PCX
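
As a concrete illustration of the GeoJSON path, a minimal sketch follows: JSON.NET does the parsing and the GeoJSON.NET FeatureCollection type describes the result. The helper class and file path are illustrative assumptions.

```csharp
using System.IO;
using GeoJSON.Net.Feature;
using Newtonsoft.Json;

// Hedged sketch of a GeoJSON IO driver's core read step.
public static class GeoJsonReader
{
    public static FeatureCollection Load(string path)
    {
        string json = File.ReadAllText(path);
        return JsonConvert.DeserializeObject<FeatureCollection>(json);
    }
}

// Usage (path is illustrative):
//   FeatureCollection fc = GeoJsonReader.Load("layers/points.geojson");
//   foreach (Feature f in fc.Features)
//       Debug.Log(f.Geometry.Type);   // Point, LineString, Polygon, ...
```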

Georeference Framework

The core of the GIS system is the Georeference framework that provides the basis for mapping from Real-World coordinates to VR-World coordinates in a zoomable map.

The ViRGIS Georeference Framework is based upon the Mapbox Unity SDK [3].
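
A minimal sketch of the Real-World to VR-World mapping is shown below, assuming the Mapbox Unity SDK's AbstractMap component; the component wiring and method names local to the example are illustrative, not the ViRGIS framework itself.

```csharp
using Mapbox.Unity.Map;
using Mapbox.Utils;
using UnityEngine;

// Hedged sketch of georeferencing via the Mapbox Unity SDK.
public class GeoReferenceExample : MonoBehaviour
{
    public AbstractMap map;   // the Mapbox map at the centre of the scene

    // Place a GameObject at a latitude/longitude on the (possibly zoomed) map.
    public void PlaceAt(GameObject go, double lat, double lon)
    {
        Vector3 worldPos = map.GeoToWorldPosition(new Vector2d(lat, lon), true);
        go.transform.position = worldPos;
    }

    // And back again: where in the real world is this GameObject?
    public Vector2d LocationOf(GameObject go)
    {
        return map.WorldToGeoPosition(go.transform.position);
    }
}
```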

3D Geometry Tools

ViRGIS App V1 includes an implementation of geometry3Sharp [4] to provide comprehensive and advanced 3D geometry tools.

As well as the usual geometry tools, this library has a comprehensive set of tools for 3D Mesh manipulation.

A key part of the integration of this toolkit into Unity and Mapbox is the marshalling of multiple data types across the three libraries. See Appendix 2 for details.
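
A minimal sketch of the kind of marshalling involved follows, assuming a hypothetical GeometryMarshal helper: geometry3Sharp works in double precision (Vector3d, DMesh3) while Unity works in single precision (Vector3, Mesh).

```csharp
using System.Collections.Generic;
using g3;
using UnityEngine;

// Illustrative helper; the actual ViRGIS marshalling code may differ.
public static class GeometryMarshal
{
    // geometry3Sharp uses doubles, Unity uses floats.
    public static Vector3 ToUnity(Vector3d v) => new Vector3((float)v.x, (float)v.y, (float)v.z);

    public static Vector3d ToG3(Vector3 v) => new Vector3d(v.x, v.y, v.z);

    // Copy a geometry3Sharp DMesh3 into a UnityEngine.Mesh.
    public static Mesh ToUnityMesh(DMesh3 dmesh)
    {
        var vertices = new Vector3[dmesh.MaxVertexID];
        foreach (int vid in dmesh.VertexIndices())
            vertices[vid] = ToUnity(dmesh.GetVertex(vid));

        var triangles = new List<int>(dmesh.TriangleCount * 3);
        foreach (int tid in dmesh.TriangleIndices())
        {
            Index3i tri = dmesh.GetTriangle(tid);
            triangles.Add(tri.a);
            triangles.Add(tri.b);
            triangles.Add(tri.c);
        }

        var mesh = new Mesh { vertices = vertices, triangles = triangles.ToArray() };
        mesh.RecalculateNormals();
        return mesh;
    }
}
```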

Event System

In V1, the entity model maintains a basic event model using messaging up and down the entity tree. When an entity (usually, but not necessarily, a leaf node) triggers an event, it uses SendMessage to send the event to all of its ancestors. When an entity receives an event, it can use BroadcastMessage to broadcast the event to all of its descendants, based upon the current state of the entity.

This relatively simple model carries the entity structure over into the event structure without needing any further configuration. The basic idea can be illustrated by the Selected event that the UI triggers on a component. This event is propagated up so that the Feature and Layer that the component is part of know that the event has occurred. If the feature is a line and is in blockEdit mode, it will broadcast the Selected event to all of the components of the line.
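
A minimal sketch of this pattern follows, assuming Unity's SendMessageUpwards is the call used to reach ancestors and BroadcastMessage the call used to reach descendants; the class, field and method names are illustrative rather than the actual ViRGIS API.

```csharp
using UnityEngine;

// Hedged sketch of the up/down messaging pattern described above.
public class ExampleEntity : MonoBehaviour
{
    bool blockEdit = true;   // stand-in for the entity's real edit state
    bool selected;

    // A leaf component (e.g. a vertex) would call this when the UI picks it:
    // the event travels up the transform hierarchy to the feature and layer.
    public void NotifySelected()
    {
        SendMessageUpwards("Selected", SendMessageOptions.DontRequireReceiver);
    }

    // Received on every ancestor. A line in blockEdit mode re-broadcasts
    // the event down to all of its vertex and segment children.
    public void Selected()
    {
        if (selected) return;          // guard against re-entry
        selected = true;
        if (blockEdit)
        {
            BroadcastMessage("Selected", SendMessageOptions.DontRequireReceiver);
        }
    }
}
```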

There are the following prototypes in the entity model (a sketch of how they might relate follows the list):

  1. IVirgisEntity. Covers all entities in the Virgis Entity Model.
  2. IVirgisFeature. Covers all visible features and parts of Features (like lines).
  3. IVirgisLayer. Covers all GIS Layers in the model. Note that a GIS Layer is not the same thing as a Unity Layer. GIS layers are collections of like data; Unity Layers are used by Unity to categorise GameObjects. There is no connection between these concepts.
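
A minimal sketch of how these prototypes might fit together is shown below; the member names are illustrative assumptions, not the actual ViRGIS interfaces.

```csharp
using UnityEngine;

// Hedged sketch of the prototype hierarchy; member names are illustrative.
public interface IVirgisEntity
{
    void Selected();      // events arriving from the UI or from descendants
    void UnSelected();
}

public interface IVirgisFeature : IVirgisEntity
{
    void MoveTo(Vector3 newPosition);   // features can be manipulated in the scene
}

public interface IVirgisLayer : IVirgisEntity
{
    void Draw();          // (re)create the feature GameObjects for this layer
    void Save();          // persist the layer back to its source data
}
```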

There is also an event system called AppState that is used to communicate changes in the application state. This is explained further in the Scripting Reference.

User Interface

The overall VR interface architecture is described in this article.

Guidelines

  1. This is NOT a game. This is an environment the user is exploring. The User’s representation in the space is not a character. It does not need a body or (much) physics (although a bit of physics about how the user moves always makes it easier on our brains). The User’s representation in the space is a camera or probe that we use to explore the space.
  2. The User is NOT the centre of the space. The data is the centre of the space. This is different from many mapping applications on the web and in games, where the space reveals itself almost infinitely as the User moves. This is GIS, and in GIS the data has bounds (called the “extent” of the data). We will use that paradigm. We move around in that extent.
  3. We have two eyes and two hands. Therefore, the User Avatar has two cameras and two representations of hands. These are provided by the XR Rig (currently the OVR XR Rig). The Game Space is set up with a putative scale of 1 m in the real world to one Unity unit, but there is also internal scaling (i.e. zoom factors) in the map that changes this. The space can be zoomed in-game.
  4. Real users are going to demand the ability to look from multiple viewpoints without the effort of moving: e.g. “in close” to change things, a “wide outside” view to get the overview, or alternate angles to understand parallax effects.
  5. The hands are used to control things.
    1. Select and manipulate,
    2. Move the character in the model,
    3. Move and scale the model,
    4. Change the state of the model and of the avatar.

Design Decisions

  1. From Guideline 1, all avatars are first-person view only.
  2. Guideline 4 suggests that one approach will be to have a main avatar and one or more drones to get alternate views.
  3. From Guideline 5:
    1. We use a ray pointer interactor on the left hand to allow the selection and manipulation of data entities,
    2. We use the controls on the right hand to move and control the avatar and manipulate entities,
    3. We use the controls on the left hand to zoom and rotate the model, and
    4. There is an interactive menu attached to the left hand to allow the application state to be changed.

Movement of the avatar is by means of a 3D “jet-pack” analogue.

Edit Session

This concept is common in GIS software. The basic idea is that changes to geospatial data are complicated and have many interdependencies, which means that a simple “undo” function can get you into more problems than solutions! Therefore, the usual approach is, effectively, to create a simple “checkpoint” at the start of an Edit Session and always to be able to get back to that checkpoint. The checkpoint is always what is currently saved in the data file.
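
A minimal sketch of the checkpoint idea, with illustrative names and a plain file-based store, might look like this:

```csharp
using System.IO;

// Hedged sketch of an edit-session "checkpoint"; not the ViRGIS implementation.
public class EditSession
{
    readonly string _dataFile;
    string _checkpoint;        // the file contents at the start of the session

    public EditSession(string dataFile)
    {
        _dataFile = dataFile;
    }

    // Start: remember exactly what is currently saved on disk.
    public void Start() => _checkpoint = File.ReadAllText(_dataFile);

    // Commit: write the edited state back; this becomes the new checkpoint.
    public void Commit(string editedContents)
    {
        File.WriteAllText(_dataFile, editedContents);
        _checkpoint = editedContents;
    }

    // Discard: roll everything back to the checkpoint instead of trying to
    // undo individual, interdependent edits.
    public string Discard() => _checkpoint;
}
```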

Notes

[1] JSON.NET is a C# library and is available under an MIT Open Source Licence.

[2] GeoJSON.NET is a C# library and is available under an MIT Open Source Licence.

[3] Mapbox Unity SDK is a C# library that is available under an MIT Open Source Licence.

[4] geometry3Sharp is a C# library that is available under a Boost Open Source Licence.