
Geodesy without math equations: Is that possible?

Geodesy without math equations: Is that possible? The answer is no, but basic geodetic concepts can be described without using complex math equations.

My previous column addressed the geodesy crisis in the United States. The newsletter was highlighted on LinkedIn (thanks, Jay); more than 235 individuals reacted to the post and there were 25 reposts.

I’m pleased so many people are interested in the discussion of the inverted pyramid. One reader of my column asked for material to help non-geodesists obtain a better understanding of geodetic concepts.

Geodesy does involve advanced mathematics that may not be familiar to some people. That said, there are various online lessons and tutorials that describe the basic concepts without using complex math equations.

As mentioned in my previous column, geodesy is involved with anything related to positioning. For example, have you ever wondered how your phone appears to know where you are on a digital map while you’re walking or driving down the street? Geodesy provides the foundation for all geospatial products and services.

Image: Dave Zilkoski

Location on a Map

A goal of mine has always been to get individuals (young and old) interested in obtaining a better understanding of geodesy. In my opinion, high schools and colleges should include courses that explain to students how their phones know where they are, why the Earth is not a sphere, how the movement of tectonic plates is measured and why, the basic concepts of how satellites orbit the Earth, and how geographic coordinates are important to making maps and their use in establishing an accurate geographic information system (GIS).

A good first step is to get high school teachers interested in the topic. When I was employed by the National Geodetic Survey (NGS), a group of us worked with local high school students to map their football field using GPS. They acquired observations in the field, and then downloaded the coordinates into their GIS. The teacher was instrumental in integrating the application into the students’ curriculum.

A reader of my last column suggested I provide concrete, meaningful things to lower the barrier of entry. I’m not exactly sure how to lower the barrier of entry — geodesy does require an individual to have a certain level of mathematical knowledge.

Since I retired from NGS, I have helped homeschool my eight grandkids. The one thing that I’ve found is that young students apparently either “like” math or they “hate” math. At least with my grandkids, there doesn’t seem to be an in between.

At this moment, I don’t believe any of my grandkids will become geodesists; well, actually, there’s still a possibility that one may have a “love for mathematics.” It appears that most students don’t really see a reason to learn math. They can use their phones or calculators to do what they need.

The reader suggested that the geodesy community could publish free, high-quality, web-based resources for the public. The reader made the following suggestions:

  • A set of 3D-printable designs for rudimentary survey tools; alternatively, how to acquire/build the tools in the most economical way possible. Even something that would be considered a “toy” that can be given to a child would be good.
  • A list of software tools (preferably open source) relevant to the subject and how to use them in this context.
  • Introductory material intended for young audiences.

This column will provide some free online lessons and tutorials that describe the concepts associated with geodesy and surveying. Some of the online videos are at a level for young audiences, and some are aimed at individuals with more advanced education. Let’s start with the young audience.

Lessons for Kids

The website “Get Kids into Survey” provides materials focused on kids. The website states: “Bringing young people into the exciting world of survey through pioneering content and engaging experiences.” See the boxes titled “Get Kids into Survey Website,” “Get Kids into Survey Website – Poster Page,” and “Get Kids into Survey Website – World Without Surveyors Poster.”

Get Kids into Survey Website

Screenshot: Get Kids into Survey

Get Kids into Survey Website – Poster Page

Screenshot: Get Kids into Survey

Get Kids into Survey Website – World Without Surveyors Poster

Screenshot: Get Kids into Survey

The GPS.gov website has lessons describing GPS that are designed for kids. One lesson introduces the concept of GPS trilateration. The lesson explains how GPS positioning is built on two basic mathematical concepts:

  1. trilateration, which literally means positioning from three distances, and
  2. the relationship between distance traveled, rate (speed) of travel, and amount of time spent traveling.

The lesson was developed by NGS for a National Science Teachers Association conference. You can download both the instructions and the map.

GPS Trilateration Lesson

Screenshot: GPS.gov website

The following are several videos that describe the concept of trilateration.

This video explains trilateration and how the GPS ranges (distances from the satellite to the receiver) are computed.

This video uses distances on a map to describe trilateration.

Here is a detailed description of trilateration and why you need the fourth satellite.

Here is a detailed description of how GPS works.
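The role of the fourth satellite can also be sketched in code. Because the receiver’s clock is offset from satellite time, every measured range is biased by the same unknown amount, giving four unknowns (x, y, z and the clock bias) and therefore requiring at least four satellites. The following Python sketch is my own illustration with invented satellite positions, not taken from any of the videos above:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sats, pseudoranges, iters=10):
    """Estimate receiver position (x, y, z) and clock bias b by
    Gauss-Newton iteration on rho_i = |sat_i - p| + b."""
    state = np.zeros(4)  # x, y, z, b (bias expressed in meters)
    for _ in range(iters):
        p, b = state[:3], state[3]
        ranges = np.linalg.norm(sats - p, axis=1)
        residuals = pseudoranges - (ranges + b)
        # Jacobian: d(rho)/dp = -(unit vector toward satellite); d(rho)/db = 1
        J = np.hstack([-(sats - p) / ranges[:, None],
                       np.ones((len(sats), 1))])
        state += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return state

# Invented geometry: four satellites, receiver at the origin with a
# 1-millisecond clock error (about 300 km of apparent range!).
sats = np.array([[20e6, 0, 20e6], [-20e6, 0, 20e6],
                 [0, 20e6, 20e6], [0, -15e6, 22e6]], dtype=float)
bias = C * 1e-3
rho = np.linalg.norm(sats, axis=1) + bias
x, y, z, b = solve_position(sats, rho)
print(x, y, z, b / C)  # recovers ~(0, 0, 0) and the 1 ms clock error
```

With only three satellites, the bias column makes the system underdetermined, which is why three distances suffice on a map but not with an imperfect clock.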

Now, let’s look at some free online lessons and tutorials that describe the concepts associated with geodesy. As previously stated, some of the online videos are at a level for young audiences, and some are aimed at individuals with more advanced education. Most of them describe the concepts using diagrams with narratives, and without complex math equations. NGS provides a number of videos that can be downloaded here.

NGS, in partnership with the COMET program, has developed a series of self-paced lessons on geodetic and remote sensing topics. Users have to create a free user account to gain access to the courses. Users will have the option of printing out a certificate upon successful completion of a quiz at the end of each lesson.

The lessons are rated by skill level ranging from “Suitable for Non-Scientists” to “Requires some Prior Knowledge of the Topic.”

The COMET program provides teaching and training resources for the geoscience community. All of the content is completely free, but an account does need to be created. The COMET program is part of the University Corporation for Atmospheric Research (UCAR) Community Programs.

NGS Online Lessons

Screenshot: NGS Website

NGS and COMET Educational Videos

NGS also has a website that contains educational videos. Again, NGS, in partnership with the COMET Program, has developed short videos on topics related to geodesy and mapping.

NGS Educational Videos

Screenshot: NGS Website

This link provides a tutorial on “Why is geodesy the framework behind all mapping and navigation?” The article states: “If you think about it, the whole field of geomatics lies on the shoulders of geodesists. Because it’s really geodesy that is the framework behind all surveying, mapping and navigation.”

What Is Geodesy?

Screenshot: Gisgeography Website

NASA’s Eratosthenes Estimating the Circumference of the Earth by Looking Down a Well

NASA offers a video titled “Looking Down a Well: A Brief History of Geodesy.” This video explains how it all started when Eratosthenes estimated the circumference of the Earth by looking down a well. It highlights how, over time, the field of geodesy has expanded and evolved dramatically, and how NASA uses technology such as radio telescopes, ground surveys, and satellites to contribute.

NASA’s Video on Looking Down a Well


UNAVCO Measures Plate Tectonics with Geodesy

UNAVCO, a non-profit university-governed consortium, facilitates geoscience research and education using geodesy. UNAVCO has a video that describes the tectonic plates and how geodesists measure their movements. Another UNAVCO video describes what geodesy actually is, as well as geodesy’s application in our everyday lives (UNAVCO’s 2017 USIP geoscience video production). Visit UNAVCO’s website to learn more about its mission.

Geodetic Software Tools

NGS provides tools focused on meeting the needs of the surveying and mapping community. A few may be of interest to non-geodesists. A map tool can be used to locate marks near someone’s location.

Production NGS Map

Screenshot: NGS Website

UNAVCO also has interactive tools that may be of interest to geospatial users. See the boxes below titled “UNAVCO Interactive Tools” and “UNAVCO Spotlight.”

Screenshot: UNAVCO Website

3D Printing of Surveying Equipment

Now, let’s address the 3D printing of surveying equipment and tools. I’m not familiar with using a 3D printer, but I found several websites that provide information on surveying equipment. Some sites provide free information and others charge for their services. See the boxes titled “3D Printer of Total Station” and “3D Printer of GNSS Equipment.”

3D Printer of Total Station

Screenshot: CULTS Website

3D Printer of GNSS Equipment

Screenshot: 3dmdb Website

As noted above, I’m pleased so many people are interested in the discussion of the inverted pyramid. As several individuals commented in the LinkedIn responses, the surveying and photogrammetry (which includes remote sensing) communities are experiencing the same crisis as geodesy. In my opinion, they are all related, because the surveying and mapping community provides the tools other disciplines use.

As stated in my last column, the surveying and mapping community can do the following to help:

  • Actively market geodesy in high schools as a rewarding career for the math stars before college entry.
  • Build back, support and sponsor geodesy programs at select universities. This support needs to be strategic with backing from the highest levels of the U.S. government.
  • Encourage U.S. government support in the form of grants, professional development of staff, and research collaborations/affiliations.

As previously mentioned, a goal of mine has always been to get individuals (young and old) interested in obtaining a better understanding of geodesy. I hope this column helps to whet the appetite of some individuals to obtain a better knowledge of geodesy. Maybe even some high school and college teachers will introduce geodetic concepts in their lectures.

Writing about the geodesy crisis is a good first step, but we need to find champions that can influence high school and university teachers and administrators, federal and state government program managers, and congressional representatives.

Please feel free to email me at geospatialsolutionsbyDBZ@gmail.com if you have suggestions on how to lower the barrier of entry into the world of geodesy.


FrontierSI to support Australia’s Ginan for LEO satellites

FrontierSI has signed a collaborative agreement with Geoscience Australia, Curtin University and the University of Newcastle to enhance Ginan with features specifically aimed at supporting low-Earth orbit (LEO) satellites as an important component of Geoscience Australia’s Positioning Australia program.

Ginan is Geoscience Australia’s GNSS analysis center software. It delivers a real-time positioning correction service through open-source software and additional positioning products to enable precise point positioning for Australian industry and users.

The design, development and deployment of LEO satellites have grown significantly over the last decade. The agreement with FrontierSI complements ongoing Ginan precise orbit determination (POD) development, focusing on the LEO satellite modeling and orbit integrator/propagator capabilities needed to process LEO GNSS data and to estimate and predict high-precision LEO satellite trajectories.

Such a capability will enable:

  • better monitoring of LEO satellites for station keeping, collision avoidance and end-of-life purposes
  • improved ionosphere and troposphere monitoring and modeling through the analysis of GNSS signal occultation, to provide data for weather prediction and precise positioning purposes.
Image: Ginan
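For readers wondering what an “orbit integrator/propagator” actually does, the Python sketch below propagates a LEO satellite under simple two-body gravity with a fourth-order Runge-Kutta step. This is my own toy illustration, not Ginan code; real POD software models many additional forces (Earth oblateness, atmospheric drag, solar radiation pressure and more):

```python
import numpy as np

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def deriv(state):
    """State derivative for two-body motion: [velocity, acceleration]."""
    r, v = state[:3], state[3:]
    return np.concatenate([v, -MU * r / np.linalg.norm(r) ** 3])

def rk4_step(state, dt):
    """One fourth-order Runge-Kutta integration step."""
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Circular LEO at 500 km altitude: orbital speed v = sqrt(MU / r)
r0 = 6_878_137.0  # Earth radius + 500 km, in meters
state = np.array([r0, 0.0, 0.0, 0.0, np.sqrt(MU / r0), 0.0])
for _ in range(95):        # 95 one-minute steps ~ one period (~94.6 min)
    state = rk4_step(state, 60.0)
print(state[:3])           # back near the starting position
```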

Learn more about Ginan here.


CHC Navigation launches LandStar8 data-collection app

Screenshot: CHCNAV

CHC Navigation has released LandStar8, a field surveying and mapping application for Android devices. LandStar8 is designed to be flexible and user-friendly for surveying and mapping tasks.

LandStar8 is versatile, modular and customizable for topographic tasks such as surveying, stakeout, cadastral work, mapping and geographic information systems (GIS). Building on the legacy of LandStar7, the new LandStar8 provides features such as a refined user interface, streamlined workflows, faster operation and integrated cloud services.

“With LandStar8, we want to provide our users with unprecedented field experience,” said Rachel Wang, product manager of CHC Navigation’s Surveying and Engineering Division. “LandStar8’s modular design allows users to customize the interface according to their usage habits, making it easier and more efficient for field crews to work.”

Cloud connectivity is built in, for backup, data storage or remote technical support.

LandStar8 has a simple and intuitive layout with large map windows and sharp graphics. Users can hide features they rarely use and display only those they need.

On LandStar8, users can copy coordinate settings, control and staking points from another handheld controller by scanning a QR code. Projects can be edited and sorted by history and attributes. Custom coordinate systems, geoid models and coding libraries can be updated at any time by using resource packages. LandStar8 also features a terrain calibration wizard designed specifically for non-expert users.

A proprietary MetaCAD graphics engine opens DWG and DXF base maps faster and with smoother rendering. DXF files up to 200 MB can be opened in less than 10 seconds. LandStar8 also supports opening external reference files, automatically recognizes CAD length units, and allows editing of CAD base maps directly in the field.

LandStar8 is designed around a comprehensive cloud-based architecture that supports project backup, collaborative work and data storage. Its remote support capabilities help the office helpdesk resolve user problems and provide personalized technical assistance. A “share code” feature allows users to transfer project data between desktop computers and field controllers or among field controllers quickly to further boost work efficiency.


NGA seeks feedback on how to improve Earth modeling

The National Geospatial-Intelligence Agency (NGA) is seeking information from the GNSS community on upgrades to its Stardust program.

Stardust develops models of the Earth used in geomatics. The upgrades will result in modernization of geomatics information technology systems and infrastructure. The update includes migration of models to the cloud.

The NGA posted a request for information (RFI), with responses due by 5 p.m. Eastern Time on Dec. 21.

Stardust is run by the NGA Foundation GEOINT Integrated Program Office, partnered with the Foundation GEOINT Group (NGA/SF) within the Source Operations and Management Directorate.


Deformation monitoring company NavStar joins Terra Insights

Photo: NavStar

NavStar — a deformation monitoring company — has joined the Terra Insights platform of geotechnical brands.

NavStar develops specialized hardware and software for automated detection of movement on slopes and structures, with an emphasis on GPS/GNSS sensors. It provides a scalable and modular data-collection and presentation software platform.

“NavStar perfectly complements Terra Insights’ vision of being the global platform to provide trusted geotechnical, structural and geospatial monitoring technology and data delivery solutions,” said Mark Price, CEO, Terra Insights. “NavStar’s specialized expertise in automated deformation monitoring systems from both a hardware and software perspective expands Terra Insights’ core capabilities while pushing us further into the future.”

NavStar’s team of surveyors, engineers, technologists and software developers has been providing specialized GPS/GNSS solutions, products and support to clients around the world since 2001.

NavStar’s specialized GeoExplorer and deformation monitoring products are used by the mining, oil and gas, power, construction and government sectors.

“We are excited to join Terra Insights,” said Glen Bjorgan, manager of Field Operations at NavStar. “Over the years we have worked extensively with the companies that make up the Terra Insights platform. Through that experience, we know that Terra Insights will be a great fit for NavStar and our customers.”


Editorial Advisory Board Q&A: Improving the GPS program

What works well and what needs improvement in the GPS program regarding technology, policy, or management?

Jules McNeff

“GPS technology and operational performance continue to set the standard for GNSS, but necessary modernization is late to need, and becoming later by the day. This reflects what I see as loss of focus on ‘Job 1’ (delivering effective GPS service to the Joint Force) and a diminution in the sense of ‘GPS uniqueness and exceptionalism’ in its management as it was fragmented within the old SMC and is no longer the ‘shiny new object’ within the evolving Space Force. Even so, its value to its global user base, and particularly to U.S. and allied militaries, is stronger than ever and it remains the cornerstone among diverse complements within the Department of Defense PNT Enterprise. It is incumbent on the DOD to ensure the GPS services our warfighters will depend on can sustain that vital role.”

— Jules McNeff
Overlook Systems Technologies


Ellen Hall

“What works well? There is good focus on the areas that need development: M-code, CRPA, resiliency. What needs improvement? More thorough and timely sharing of information by the government with industry.”
— Ellen Hall,
Spirent Federal Systems


Mitch Narins

“The ‘GPS program’ has set the standard for all other GNSS efforts, but there are always lessons to be learned. I have full confidence that USSF leadership is well equipped to deal with both the technology and management aspects of the program. As for policy, which supports military and civil uses worldwide, there is a clear distinction, based on mission areas and acceptable risk. However, risks to civil users have increased as GPS PNT services permeate all civil critical infrastructure systems. Therefore, system improvements directed at civil user PNT resilience should be given a higher priority and funded through appropriate civil channels. I encourage a policy to enable more resilient PNT services from space — and to consider that by looking both ‘up’ and ‘down’ for PNT services, unfortunate ‘situations’ might be avoided.”
— Mitch Narins,
Strategic Synergies


Bernard Gruber

“One of the most consistent and enduring enablers of the GPS program is national policy. NSPD-39 re-baselined requirements, buttressed by the commitments that GPS be provided to the world for free, that it be sustained, and that it have an ever-present focus on performance improvement and robustness. Accordingly, NSPD-7 acknowledges an ever-changing world with a nod to cybersecurity, augmentations and direction to ‘improve NAVWAR capabilities to deny hostile use of United States Government space-based PNT services, without unduly disrupting civil and commercial access to civil PNT services.’”
— Bernard Gruber,
Northrop Grumman


Another Orolia ELT receives Cospas-Sarsat certification

New-generation aircraft ELT meets new European Union Aviation Safety Agency (EASA) and U.S. Federal Aviation Administration (FAA) requirements

Photo: Orolia

Orolia has received certifications for yet another survival emergency locator transmitter (ELT), the Ultima-S.

The news follows Orolia’s announcement that it had received certification for the Ultima-DT model, as well as a personal locator now shipping to the U.S. Army.

The Ultima-S is a new generation ELT installed in either the cabins or liferafts of aircraft. It relays accurate aircraft location information to search-and-rescue teams.

Once activated, the ELT transmits a 406-MHz distress signal that includes its location, provided by the Ultima-S internal GNSS receiver. This built-in GNSS capability increases both the probability and the speed of detection of the distress signal.

“With these key certifications for the Ultima-S, Orolia brings a long-awaited solution to the industry,” said Jérôme Ramé, Orolia’s Aviation & Military Product Line Director. “We have developed strong partnerships with several of the leading aircraft manufacturers that will enable operators worldwide to benefit from the Ultima-S for both their linefit and retrofit needs, allowing fleet standardization.”

The Ultima-S provides free, global coverage through the dedicated Cospas-Sarsat infrastructure while meeting the highest aviation safety standards. Orolia offers non-rechargeable lithium batteries compliant with the latest FAA and EASA special conditions standards, also known as TSO-C142b/DO-227A. The Ultima-S also meets the most recent ELT performance and environmental standards through TSO-C126c.

“What makes the Ultima-S unique is a new feature called the Return Link Service (RLS),” said Ramé. “Through this capability, the user is automatically notified when the distress signal is detected and located by the Cospas-Sarsat ground infrastructure. The Ultima-S links directly to the European Galileo GNSS satellite constellation, providing the most reliable and timely information for reaching aircraft crew members in distress.”

The Ultima-S is available on a linefit basis on major aircraft programs. In addition, Orolia has launched an exchange program to make retrofit activities easier for airlines, especially those upgrading to safer battery technology.


Space Force orders 3 more GPS IIIF satellites from Lockheed

The three new GPS satellites will be delivered under the third production option of the GPS III contract

Space Systems Command (SSC), a division of the U.S. Space Force, has exercised its third production option valued at $744 million for the procurement of three additional GPS III Follow-On satellites from Lockheed Martin.

The contract option covers GPS IIIF Space Vehicles (SVs) 18, 19 and 20.

GPS IIIF will provide several next-generation capabilities to meet increased demands of both military and civilian users. Building on the technical baseline of satellites 01 to 10, the newer satellites will provide increased anti-jam capabilities for the military with the addition of a Regional Military Protection capability.

A laser retro-reflector array will enable precision ranging measurements, and the satellites will address the consolidation of telemetry, tracking and commanding frequencies.

Additionally, GPS IIIF leverages major international collaboration with the Canadian Department of National Defense and other U.S. government organizations such as the National Oceanic and Atmospheric Administration, the Air Force Rescue Coordination Center, and the U.S. Coast Guard Office of Search and Rescue (SAR) by hosting a new SAR payload.

This payload enhances the SAR mission by extending distress alert detection and location to 100 percent continuous global coverage and by reducing location uncertainty to less than 5 km, in support of 49 international partners.

Finally, the program will host a redesigned Nuclear Detonation Detection System that has a lower overall size, weight and power requirement.

“Along with our industry and government partners, the GPS IIIF team continues to add world-class capabilities that underpin U.S. national security needs to both our warfighters and civil users across the globe as the most utilized United States Space Force capability,” said Col. Jung Ha, GPS Space Vehicles senior materiel leader for SSC Military Communication and Positioning, Navigation and Timing.

The GPS IIIF SV11-12 satellites were included in the original GPS IIIF contract awarded to Lockheed Martin in September 2018 to build up to 22 GPS IIIF satellites. Under that contract, SSC exercised the first production option for SV13-14 in October 2020 and second production option for SV 15-17 in October 2021.

GPS IIIF’s M-Code can be broadcast from a high-gain directional antenna in a concentrated, high-powered spot beam, in addition to a wide-angle, full-Earth antenna. (Artist rendering: Lockheed Martin)

Artist’s rendering of a GPS III satellite. (Image: Lockheed Martin)

About Space Systems Command

Space Systems Command is the U.S. Space Force field command responsible for rapidly identifying, prototyping and fielding resilient space capabilities for joint warfighters.

SSC delivers sustainable joint space warfighting capabilities to defend the nation and its allies while disrupting adversaries in the contested space domain. SSC mission areas include launch acquisition and operations; space domain awareness; positioning, navigation, and timing; missile warning; satellite communication; and cross-mission ground, command and control and data.


The role of atomic clocks in data centers

How the atom went from data’s worst enemy to its best friend

By David Chandler, product marketing manager, Frequency and Timing Systems business unit, Microchip Technology

GNSS constellations are precise timing systems. (Image: Microchip Technology)

Timing from atomic clocks is now an integral part of data-center operations. The atomic clock time transmitted via the Global Positioning System (GPS) and other Global Navigation Satellite System (GNSS) networks synchronizes servers across the globe, and atomic clocks are deployed in individual data centers to preserve synchronization when the transmitted time is not available.

This high level of synchronization is vital to ensure the zettabytes of data collected around the globe every year can be meaningfully stored and used in many applications, whether due to system requirements or to ensure regulatory compliance. The quantum nature of the atom enables this precise time and is critical to ensuring that ever more data can be processed at ever faster speeds in the future. Ironically, just a few years ago the quantum nature of the atom was seen as the ultimate death of this increase in data processing and speed.

In 1965, Gordon Moore predicted the transistor count on an integrated circuit would double every year. This was eventually revised to doubling every two years. Along with this increase in transistor density came an important increase in speed as well as decreases in cost and power consumption. 

It may have been hard in 1965 to imagine there would be any real-world need to have a semiconductor with 50 billion transistors on it in 2021, but as semiconductor technologies kept up with the law, so did application demands. Cell phones, financial trading and DNA mapping are all applications that rely heavily on the number of operations per second a microprocessor can execute, which is closely tied to the transistor count on a chip. 

Satirical image of an engineer trying to keep up with Moore’s Law. (Image: Microchip Technology)

The Demise of Moore’s Law

Unfortunately, Moore’s Law is rapidly coming to an end due to a limit imposed by physics. With wafer fabrication now in the sub-10-nm technology nodes, the transistor sizes are only about 10 to 50 times that of a silicon atom. At this scale, the size and quantum properties of atoms and free electrons significantly prohibit further size reduction. In essence, you could think of the atom as the ultimate court that struck down the law. 

But while Moore’s Law will come to an end, the thirst for increased processing power will continue to grow. With the advent of the internet of things (IoT), streaming services, social media posts and self-driving cars, the amount of data generated every day continues to increase exponentially.

In 2021, an estimated 2.5 exabytes (2,882,303,761,517,120,000 bytes) of data were generated every day. Exabyte databases managing more than 100,000 transactions per second (a transaction consists of multiple operations) are currently in use, and both the size of the databases and the transactions per second will continue to grow for the foreseeable future.

Synchronizing the Machines

This explosive growth in the volume of data — coupled with the speed at which the data must be written, read, copied, analyzed, manipulated and backed up — required data-center architects to find a way around the end of Moore’s Law. The architects employed horizontal scaling in a data center with distributed databases, where instead of an entire database residing on one server, the database is distributed over multiple servers in a cluster. 

In this configuration, the cluster essentially functions as one giant machine, hence the size and speed of the system now becomes limited by the physical size of a data center rather than by the size of an atom. (Take that, atom!)

Software engineers now make careers writing code that enables horizontal scaling. For all the software to work, however, all the machines must be synchronized. Otherwise it violates a concept called causality. 

What is causality? It is easiest to explain through an example. Suppose you have two cameras to record images for a 100-meter dash, each with its own internal clock. The first camera is at the starting blocks. The second camera is at the finish line. Both sensors are continually firing and timestamping each image with the time from their respective clocks. 

Clock uncertainty causes issues with causality. In this case, a race officially finished before it started. (Image: Microchip Technology)

To determine the official time of the winning sprinter in the race, the first camera’s images are reviewed for the point in time when the first runner left the block and this time-stamp is subtracted from the time-stamp on the last camera’s image for that runner crossing the finish line. 

For this to work, both cameras must be synchronized to an acceptable level of uncertainty. If the synchronization of the clocks is only ±0.05 seconds, you would be unable to determine if someone who was recorded as running 9.6 seconds actually broke the world record of 9.58 seconds. What if they were only synchronized to ±5 seconds from the stadium clock? 

Imagine this scenario: Observed from the main stadium clock, a race starts at exactly 12:00:00.00 p.m. The first runner crosses the finish line at 12:00:09.60 p.m. From the perspective of the main stadium clock, the official race time was 9.6 seconds.

But what if the first camera’s clock was exactly 5 seconds fast and the second camera’s clock was exactly 5 seconds slow? The race would officially start at 12:00:05.00 p.m. and finish at 12:00:04.60 p.m. The race would officially finish 0.4 seconds before it started, the world record would be shattered, the laws of physics would be broken, and the current record holder would most likely be wrongfully dropped by all his sponsors.

Applying Causality to a Database

The same principle of causality is important in a database. Transactional record updates must appear in the database in the sequential order in which they occurred. If you count on the direct deposit of your paycheck arriving prior to having a direct withdrawal to pay your monthly mortgage, and the bank’s database did not record these in the correct sequence, you will be charged an overdraft fee. On one machine, causality errors are easy to prevent, but on multiple servers, each with its own internal clock, the servers must be synchronized and timestamp every transaction.

To achieve this, one server must act as a reference clock, much like the stadium clock, and it must distribute time to each server in a way that minimizes the time error of each server clock. The uncertainty of each timestamp (±5 seconds in the race) forms a time envelope that is twice the uncertainty of the clock (10 seconds for the race). For a distributed database, the number of nonoverlapping time-envelopes that can fit into a second should be at least on the order of the number of transactions per second expected for the system. 

Probability, criticality of causality, and cost of implementation will ultimately all play a role in the final solution, but this relationship is a good starting point. A system with time-stamp uncertainties of ±1 millisecond would have time-envelopes of 2 milliseconds, and a maximum of 500 non-overlapping time-envelopes would fit in one second. This system could support approximately 500 transactions per second. 
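Those figures follow directly from the envelope arithmetic. Here is a minimal sketch (the function name is my own; the uncertainty values are the ones used in this article):

```python
def max_transactions_per_second(uncertainty_s):
    """Non-overlapping time-envelopes per second for a given timestamp
    uncertainty: each envelope is twice the uncertainty wide."""
    return 1.0 / (2.0 * uncertainty_s)

print(max_transactions_per_second(1e-3))  # +/-1 ms ->         500
print(max_transactions_per_second(1e-6))  # +/-1 us ->     500,000
print(max_transactions_per_second(5e-9))  # +/-5 ns -> 100,000,000
```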

Where NTP and PTP Fall Short

Time-over-Ethernet technologies known as Network Time Protocol (NTP) and Precision Time Protocol (PTP) are used to synchronize all the servers in a distributed database in a data center. These protocols allow a local area network to distribute time with sub-millisecond (NTP) or sub-microsecond (PTP) uncertainties, enabling thousands (NTP) or millions (PTP) of transactions per second.

Unfortunately, even with these solutions that enabled a detour around the atom-imposed demise of Moore’s Law, physics has thrown another roadblock in the path of distributed databases in the form of the speed of light. 

Imagine a well-synchronized distributed database operating with PTP in San Jose, California, happily executing 100,000 transactions per second with no causality issues. One of the database architects is sitting in his office in New York and his boss asks him to update a large series of records. 

The architect wants to be able to exploit his new database to its full extent and show off the system capabilities. He plans on executing 100,000 transactions per second. 

To update records per the request, he creates a simple transaction that adds the value of one record to a second record only if the value of the first record is greater than the second record. To accomplish this, he must issue a read to both records. His local machine in New York will then compare the values, then send a write command to the second record when needed.

After completing this, he then wants to execute the next transaction that compares a third value to the new sum. If the new sum is greater than the third record, then the third record is replaced with the sum. He wants to repeat this for 6 million records. Because the database is capable of 100,000 transactions per second, he thinks it will be done in roughly a minute. He tells his boss he will have the records updated in five minutes, then leaves to get a cup of coffee. 

While drinking his coffee, he reads a story about how the new 100-meter dash record is negative 0.4 seconds, which defies the laws of physics, and that the previous record holder is suing the stadium officials because he has lost all his endorsement money. The architect laughs to himself and thinks that the stadium should have hired him as the synchronization expert.

He comes back to his desk five minutes later and is dismayed to see that his database update has completed fewer than 1,500 transactions. He sadly realizes his mistake and prepares his résumé to send it over to the stadium, where he hopes his PTP deployment won’t have the same problem. 

What went wrong? The speed of light limits the theoretical fastest possible transmission of data between New York and San Jose to 13.7 milliseconds. 

The speed of light imposes a theoretical limit to the speed at which data can be transferred between two points. (Image: Microchip Technology)

The Distance Problem

Unfortunately, real world transactions are even slower. Even with a dedicated fiber-optic link between the two locations, the refractive index of the fiber, the real-world path of the fiber and other system issues make this transit time even slower. So just one transmission from New York will take 40 to 50 milliseconds to arrive in San Jose. 

However, in this transaction there are four unique operations. There are two read operations, which can happen in parallel, whose results then must be sent back to New York. That round trip takes 80 to 100 milliseconds. Then, once both values are compared, a write operation is issued, and a write acknowledgement must be sent back indicating the write operation completed before the next transaction can start.

Suddenly, it doesn’t matter that the database can perform 100,000 transactions per second, because the distance is limiting the system to 5 transactions per second. To complete the 6 million transactions, this system would take 13 days, more than enough time for several more cups of coffee and to update a résumé. This delay is referred to as communications latency.
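The numbers in this story are easy to check. A quick sketch, using the distances and delays assumed above:

```python
C = 299_792_458.0  # speed of light, m/s

# Light-speed floor over roughly 4,130 km (about New York to San Jose):
print(4.13e6 / C * 1e3)                  # ~13.8 ms one way, the theoretical limit

one_way = 0.045                          # realistic fiber delay, ~40-50 ms
rtt = 2 * one_way
# Reads happen in parallel (1 round trip), then write + acknowledgement
transaction_time = 2 * rtt               # ~0.18 s per transaction
print(1 / transaction_time)              # ~5 transactions per second
print(6_000_000 * transaction_time / 86_400)  # ~13 days for 6M transactions
```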

Circumventing Latency 

But just like with Moore’s Law, database architects figured out how to circumvent latency. Database replications are created near the users, so they can work with the data without having to send signals across the country. 

Periodically, the replications are compared and reconciled to ensure consistency. During the reconciliation process, the transaction time-stamps are used to determine the actual sequence of transactions, and records are sometimes rolled back when there is an irreconcilable difference such as when the transaction time-envelopes overlap. Reducing clock uncertainty reduces the number of irreconcilable differences in replicated instances, as more time-envelopes reduce the probability of overlaps. This results in higher efficiencies and lower probabilities of data corruptions. 

But now the timestamping has to be accurate not only within each data center, but also between data centers, which can be separated by thousands of miles and connected via the cloud. This is a much more difficult task, as it requires an external reference with very low uncertainty that is readily available in both locations.

Down to the Atomic Level

Enter the previous foe of the database architect, the atom. While the atom was busy repealing Moore’s Law, its subatomic particles were busy spinning. The neutrons and protons in the nucleus were rotating, while at the same time the electrons were busy orbiting about the nucleus, while also spinning on their own axes. This is analogous to Earth orbiting around the sun while simultaneously spinning on its axis. 

The electrons can spin around their axes clockwise or counterclockwise. Considering there are roughly 7 octillion (7 with 27 zeros after it) atoms in a human, with all the subatomic particles spinning in our bodies, it is amazing we aren’t permanently dizzy. (Note: The subatomic particles aren’t really busy spinning and orbiting, they are really busy giving us probability wave functions and magnetic interactions that would give us results similar to what would happen if they were spinning and orbiting. But if the thought of all the spinning makes you dizzy, trying to comprehend the reality of quantum mechanics will make you positively nauseous.)

Conceptual atoms with nucleus and valence electron with nuclear spin, electron spin and orbital spin. (Image: Microchip Technology)

When microwave radiation at a very specific precise frequency is absorbed by an electron, the direction of spin about the electron axis can be changed. If this happened to Earth, the Sun would suddenly set in the east and rise in the west! 

Atomic clocks are machines designed to detect the state of the electron spin, and then change that direction through microwave radiation. The frequency varies depending on the element, the isotope, and the excitation state of the electrons. 

Once the machine determines the frequency, known as the hyperfine transition frequency, the period can be determined as the inverse of the frequency, and the number of periods can be counted to determine the elapsed time. The international definition of the second is 9,192,631,770 periods of the radiation required to induce the hyperfine transition of an electron in the outer orbital shell of a cesium atom.  

Atomic clocks are the most stable commercially available clocks in the world. An atomic clock the size of a deck of cards called the chip-scale atomic clock (CSAC) will drift 1 millionth of a second in 24 hours, whereas an atomic clock the size of a refrigerator called a hydrogen maser will only drift 10 trillionths of a second in 24 hours. (Coincidentally, 10 trillionths is also about the ratio of the radius of the hydrogen atom to the height of the sprinters in the 100-meter dash and of the now-unemployed data-center architect in New York.)

With the accuracy provided by these atomic clocks, approximately 500,000 to 50 billion non-overlapping time-envelopes per second can be provided for a distributed database running in data centers in Tokyo, London, New York, Timbuktu or anywhere else in the world.

The unit second is defined by counting 9,192,631,770 cycles of the cesium hyperfine transition radiation frequency. (Image: Microchip Technology)

Time for Distribution

How does time get to all the data centers from these atomic clocks? Coordinated Universal Time (UTC) is a global time scale distributed by satellites, fiber-optic networks and even the internet. UTC itself is derived from a collection of high-precision atomic clocks located in national laboratories and timing stations around the world. Contributors to UTC receive a report that provides the UTC time from these clocks and their individual offsets from calculated UTC. The labs and other facilities then transmit the time to the world.

The UTC report is published monthly and tells the national labs their minuscule timing offset from UTC during the previous month. Technically, we don’t know precisely what time it was until a month after the fact. To make things worse, extra seconds, called leap seconds, are periodically added to UTC due to variations in the Earth’s rotation and our position relative to observable stars. While this aligns Earth to the universe, it causes havoc in data centers and 100-meter dashes.

The hyperfine transition frequency produced in a hydrogen maser, 1.420405751 GHz, will cause spin reversal in an electron. (Image: Microchip Technology)

Enter GNSS

Two common methods used by data centers to acquire UTC are via the internet, using publicly available NTP time servers, and via satellite, using GPS or other GNSS networks. While timing through public NTP time servers over the internet was common during early deployments of distributed databases, inherent performance, traceability and security issues have pushed data centers to move away from this solution.

Even though GPS and other GNSS are typically thought of as positioning and navigation systems, they really are precision timing systems. Position and time at a receiver are determined by the transit time of signals traveling at the speed of light from multiple satellites to the receiver. Ironically, this is another case of a physics principle causing a problem — in this case the speed of light instead of the atom — but also contributing to the solution. 

The satellites have their own onboard atomic clocks, which are synchronized to UTC that was transmitted to the satellites from ground stations. Acquiring UTC with this method can provide time uncertainties in the 5-nanosecond range, enabling 100 million time-envelopes per second. 

This method is far more reliable and accurate than public NTP servers. While the satellite signals can be interrupted by events such as solar storms or intentional jamming, backup clocks that have been synchronized to the satellite signals while present can be placed in each data center to provide the desired uncertainty levels during these interruptions.
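A rough sense of the holdover requirement follows from the drift rates quoted earlier, assuming (simplistically) that drift accumulates linearly; the function name and the one-microsecond budget are my own illustration:

```python
def holdover_hours(budget_s, drift_per_day_s):
    """How long a free-running backup clock stays within a time budget,
    assuming linear drift at its quoted daily rate."""
    return budget_s / drift_per_day_s * 24.0

budget = 1e-6  # stay within +/-1 microsecond of the reference
print(holdover_hours(budget, 1e-6))   # CSAC, ~1 us/day drift: ~24 hours
print(holdover_hours(budget, 1e-11))  # hydrogen maser, ~10 ps/day: ~2.4 million hours
```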

The evolution of database transaction rates and the enabling and disabling technologies. (Image: Microchip Technology)

Next Up: Jumping Electrons

As our quest to acquire, store and transact data in the future continues to grow, novel atomic-clock technologies and time transmission systems with lower uncertainties will be needed. Currently, national timing labs are developing atomic clocks that work on the optical transitions that occur when an electron jumps orbital shells. These offer frequency stabilities to a quintillionth of a Hertz and will eventually be used to redefine the unit second.

Signal transmission through dedicated fiber-optic links or airborne lasers is already yielding improved transmission accuracy. With these continued innovations, data, the atom and light will continue their complex love-hate relationship, enabling ever larger quantities of data to be processed at ever increasing rates without consistency issues or causality casualties.


Seen & Heard: Finding Nemo, weighing bears

“Seen & Heard” is a monthly feature of GPS World magazine, traveling the world to capture interesting and unusual news stories involving the GNSS/PNT industry.


Photo: Alexey_Seafarer/iStock/Getty Images Plus

HOW BIG IS THAT BEAR?

Monitoring the weight of polar bears — an important health factor — usually means tranquilizing them from the air and lifting them with a tripod attached to a scale. However, technology might provide a non-invasive solution. Various zoos and sanctuaries are testing the accuracy of lidar scanners to measure the weight of polar bears, reports Geo Week News. The scans could be done using drones and mobile mapping equipment and techniques, according to Joel Cusick, a GIS specialist for the National Park Service.


Photo: PaulFleet/iStock/Getty Images Plus

SLIP SLIDING AWAY

Researchers used a combination of GNSS and interferometric synthetic aperture radar (InSAR) data from Sentinel-1 satellites to determine subsidence in 99 cities around the world between 2015 and 2020. Subsidence rates in Tianjin, Semarang and Jakarta exceed 30 mm per year. Even in mostly stable cities, areas are sinking faster than sea level is rising, with Istanbul, Lagos, Taipei, Mumbai, Auckland and Tampa sinking faster than 2 mm per year in some areas. Besides climate change, causes include groundwater extraction, mining, reclamation of natural wetlands, infrastructure projects and ecological disturbances. The study is published in Geophysical Research Letters.


Photo: NOAA Fisheries/Raymond Boland

FINDING NEMO

National Oceanic and Atmospheric Administration (NOAA) ocean mapping ship Rainier completed a five-month expedition to the Mariana Islands in September, combining mapping and charting with coral-reef ecosystem surveying. Collecting high-resolution mapping data in near real time improved the effectiveness of traditional marine science data collection as the combined team mapped 4,000 square nautical miles of seabed and conducted 1,800 SCUBA dives. The data will improve navigation safety through updated NOAA nautical charts and increase understanding of coral reefs through the National Coral Reef Monitoring Program. Besides charts, the seabed mapping data supports marine protected areas, sustainable fisheries, and offshore wind siting — and, in the Marianas, is important for tsunami modeling.


Photo: mikulas1/iStock/Getty Images Plus

GRAVITY DOWN UNDER

An airborne gravity sensor is flying above 80,000 square kilometers of New South Wales (NSW), Australia, collecting data that will improve the accuracy of real-world heights from GNSS positioning to just a few centimeters. Data for the 18-month NSW Gravity Model project will be captured in five stages, starting in Western NSW. The resulting model is expected to enable better resource management, infrastructure planning and natural hazard preparation. It is also a critical building block for developing digital twins, replacing datasets that predate GNSS positioning.