Every year Consult Hyperion publishes our Live 5, in which we look at the year ahead and think about what will be impacting our clients. Micro-location is one of our themes for 2021.

Today I want to explore the topic of micro-location, mostly from the point of view of the Apple ecosystem, and how developers can leverage application programming interfaces (APIs) to build useful apps. To understand that, we should first visit the topic of location in general – how do devices know where they are?

Macro-location

Macro-location is accomplished through a combination of satellite positioning (GPS, GLONASS, Galileo and BeiDou), WiFi and mobile tower triangulation, Bluetooth low energy (BLE) and accelerometry. These approaches were, until recently, the best available, but suffer from a number of drawbacks:

  • Satellite-based approaches are power hungry, may be slow to acquire a signal, suffer from signal reflections in built-up areas and are largely unusable indoors. Anyone using a mapping service on their mobile phone in the middle of a large city will be familiar with the issue. In addition, satellite-based location has an accuracy of ~3m at best.
  • WiFi-based macro-location requires, at a minimum, that the user logs into the access point, while cellular triangulation requires device support and a minimum of three towers in range, and likewise suffers from signal reflection.
  • BLE was not really designed for ranging: the only way to estimate the distance between devices is to use received power as a proxy. This in itself presents limitations – the need for consistent transmitter/receiver calibration, sensitivity to ‘bags of water’ (humans) being between the devices, and so on (a sketch of the underlying maths follows this list). In our testing we found that the distance measurements are not very accurate (±0.75m) or stable (we have seen the estimated distance between two stationary devices vary by as much as 15%). This is one of the reasons that exposure tracking (e.g. for COVID-19) presented significant challenges, not just for outside developers of the tracing apps but also for Google and Apple, who developed the main frameworks underpinning many countries’ exposure notification apps.
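As an illustration of why received-power ranging is so fragile, here is a minimal sketch of the log-distance path-loss model that typically underlies BLE distance estimates. The calibrated power at 1m and the path-loss exponent are assumptions that vary by device and environment, which is exactly why the estimates drift.

```swift
import Foundation

// Log-distance path-loss model: d = 10 ^ ((txPowerAt1m - rssi) / (10 * n)).
// Both txPowerAt1m and n are assumed, per-device/per-environment values.
func estimatedDistance(rssi: Double,
                       txPowerAt1m: Double = -59.0,
                       pathLossExponent n: Double = 2.0) -> Double {
    pow(10.0, (txPowerAt1m - rssi) / (10.0 * n))
}

// A 4 dB swing in received power – easily caused by a hand or body in the way –
// moves the estimate by nearly a metre at this range.
print(estimatedDistance(rssi: -63))  // ≈ 1.58 m
print(estimatedDistance(rssi: -67))  // ≈ 2.51 m
```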

On top of the technology difficulties there are also significant privacy considerations, with many unscrupulous app vendors abusing location data in a variety of ways. This has prompted Apple (and, to a lesser extent, Google) to restrict access to what they consider sensitive APIs and to surface attempts by apps to gain access to, for example, location data.

Where does Ultra Wide Band come in?

Originally devised, around the same time as WiFi and Bluetooth, as a short-range device-to-device communication technology otherwise known as ‘wireless USB’, UWB languished while WiFi and Bluetooth got progressively faster and more capable. In effect the consumer end of UWB was not competitive with WiFi in terms of either range or data throughput, and BLE took over at the low-power, low-data-rate end. However, UWB found niche uses in ground-penetrating radar and in automotive collision-avoidance radar.

How is UWB different from e.g. BLE?

One significant difference between UWB and other wireless technologies is that more conventional systems transmit information by varying the power level, frequency and/or phase of a sinusoidal wave. In contrast, UWB transmits information by generating pulses of energy, allowing for pulse-position (and hence time) modulation. This allows a UWB radio system to calculate the ‘time of flight’ of the transmission, helping to overcome multipath propagation (reflections), and allows for very fine-grained distance measurement: around 20cm accuracy 50ns after ranging starts, improving to ~3cm within 100ns. It also allows the radios to determine the angle between them.
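For intuition, here is a sketch of the arithmetic behind simple two-way ranging (not Apple's actual protocol): one device measures the round-trip time of a pulse, the other reports how long it took to reply, and the difference gives the time of flight.

```swift
import Foundation

let speedOfLight = 299_792_458.0   // metres per second

// Single-sided two-way ranging: time of flight is half of
// (round-trip time minus the peer's reply delay).
func twoWayRangeMetres(roundTripSeconds: Double, replyDelaySeconds: Double) -> Double {
    let timeOfFlight = (roundTripSeconds - replyDelaySeconds) / 2.0
    return timeOfFlight * speedOfLight
}

// 66.7 ns of extra round-trip time corresponds to roughly 10 m of separation;
// a 2 ns timestamp error already shifts the result by about 0.3 m, which is why
// UWB's very fine pulse timing is what makes centimetre-level ranging possible.
print(twoWayRangeMetres(roundTripSeconds: 1.0e-3 + 66.7e-9, replyDelaySeconds: 1.0e-3))
// ≈ 10.0 m
```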

Where do I get it?

Currently there are a number of devices that have UWB hardware, including the iPhone 11 family (3 devices), the iPhone 12 family (4 devices), the Apple Watch Series 6 and the HomePod Mini (although there is currently no programmer access to the UWB hardware in watchOS or on the HomePod).

On the Android side there are the Samsung Galaxy Note20 Ultra and Galaxy Flip 2, and Xiaomi, Oppo and Vivo are rumoured to be including the technology in their flagship devices. A number of prototyping chips and boards are also available:

  • NXP NCJ29D5
  • NXP SR100T
  • Qorvo DW1000
  • Qorvo DW3000

The above development boards use the HRP (higher pulse repetition frequency) standard, whereas the 3dB 3DB8630 uses the LRP (lower pulse repetition frequency) standard (ISO/IEC 24730-61).

What can we do with UWB?

Apple have an internal framework that is used in support of the ‘AirDrop’ feature of iPhones and as part of Car Key. These APIs are not available to developers but, as happened with NFC, I expect some expansion of capability over time (for reference, NFC APIs were completely unavailable for a couple of years apart from Apple Pay; over time Apple have opened them up somewhat, although they still restrict access to payment-related application identifiers).

Hardware is all well and good, but you need software to make it do useful things, and whilst it is still early days, Apple offers a privacy-protecting API (NearbyInteraction) for building micro-location-based software on iOS. There are also signs of an API being included in the next version of Android (12), although this has not been finalised. Samsung have developed some APIs for use in their Internet of Things ‘SmartThings’ products, and these have been merged into the Android Open Source Project (AOSP) as the ‘UwbManager’ class. Hopefully, as UWB hardware becomes more common, the incentive to develop the APIs will hasten the availability of access for developers.

Apple's approach

Although UWB was initially developed as a communication ‘channel’, Apple has decided to offer its initial APIs only for ranging of distance and direction. This results in the need for another communication channel (e.g. WiFi or BLE) to be present. An additional restriction is that the app using the framework must be in the foreground for the UWB session to stay active. This complicates development somewhat but is not insurmountable. The sample that Apple shared has these as typical steps (a minimal code sketch follows the list):

1. Verify that the device has UWB support

2. Start a UWB session

3. Get a one-time UWB session token

4. Start another communication channel, e.g. multi-peer (Bonjour), preferably encrypted

5. When another device becomes available over multi-peer, share the token

6. Wait to see if another device has a token to share, and if so then the nearby interaction begins.

7. Changes in the distance/angle between the devices trigger updates, which are available on both sides of the interaction – this is when we can do the ‘real work’

8. At some point the communication between the peers will be interrupted, and if the UWB session is closed then a new one will need to be set up (with different tokens)
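To make the steps concrete, here is a minimal sketch using the NearbyInteraction framework, with the token exchange over the side channel reduced to stubs. The class and helper names (RangingController, sendTokenToPeer, peerTokenReceived) are mine, not Apple's, and error handling is omitted.

```swift
import Foundation
import NearbyInteraction

// Minimal sketch of steps 1–8; the side-channel transport is stubbed out.
final class RangingController: NSObject, NISessionDelegate {
    private var session: NISession?

    func start() {
        guard NISession.isSupported else { return }               // 1. device has a U1 chip
        let session = NISession()                                  // 2. start a UWB session
        session.delegate = self
        self.session = session
        guard let token = session.discoveryToken else { return }   // 3. one-time session token
        // 4–5. send the token to the peer over another channel (multipeer, BLE, …)
        let data = try? NSKeyedArchiver.archivedData(withRootObject: token,
                                                     requiringSecureCoding: true)
        sendTokenToPeer(data)
    }

    // 6. when the peer's token arrives over the side channel, ranging can begin
    func peerTokenReceived(_ data: Data) {
        guard let peerToken = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: NIDiscoveryToken.self, from: data) else { return }
        session?.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // 7. distance/direction updates arrive on both devices
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let nearest = nearbyObjects.first else { return }
        print("distance:", nearest.distance.map { "\($0) m" } ?? "unknown",
              "direction:", nearest.direction.map { "\($0)" } ?? "unknown")
    }

    // 8. if the session is invalidated, a fresh session (and token) is needed
    func session(_ session: NISession, didInvalidateWith error: Error) {
        self.session = nil
    }

    private func sendTokenToPeer(_ data: Data?) { /* transport-specific, omitted */ }
}
```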

We have played with the framework and developed a proof of concept of how we could run a transit-type transaction, with one device acting as the gate and the other acting as the ticket. In the demo, the two devices find each other and, once the distance between them falls below a pre-set threshold, the transaction data is passed between the devices as if running a standard EMV transaction, with the ‘gate’ acting as the terminal and the ‘ticket’ acting as the card. Once the transaction is completed, the ticket becomes invalid. The demo flow is shown in Figure 1.




Figure 1. Demo app steps.
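To illustrate the gate side of that flow, here is a rough sketch of how a distance threshold could trigger the ticket exchange. The class name, the 0.5m threshold and performTicketTransaction() are hypothetical; the actual demo is more involved.

```swift
import NearbyInteraction

// Hypothetical gate-side logic: run the ticket exchange once the
// approaching device is close enough.
final class GateController: NSObject, NISessionDelegate {
    private let openThreshold: Float = 0.5   // metres – tune to the gate geometry
    private var transactionStarted = false

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard !transactionStarted,
              let ticketDevice = nearbyObjects.first,
              let distance = ticketDevice.distance else { return }

        if distance <= openThreshold {
            transactionStarted = true
            // The EMV-style exchange runs over the separate data channel
            // (multipeer/BLE); UWB is only used for the ranging here.
            performTicketTransaction()
        }
    }

    private func performTicketTransaction() { /* gate-as-terminal exchange, omitted */ }
}
```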

UWB framework performance

The performance of the UWB framework appears to be good. Direction updates depend on a distance having been established, but not vice versa. Distance values are represented as floats and measured in metres; direction values are represented as a three-component float vector (x, y, z), from which angles in radians can be derived.

The coordinate system is relative to the centre of the device’s body. As the user looks at the device’s screen:

  • The x-axis extends positively to the right.
  • The y-axis extends positively upward.
  • The z-axis extends negatively starting at the device and moving away from the user.

Figure 2 illustrates the direction-vector coordinate space from two different angles.

Figure 2. Direction axes definitions.

The half sphere indicates that the z value is constrained to [0..-1]. Due to the U1 chip’s line of sight, the direction vector supports only a subset of the half sphere’s surface area.
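For working with the direction vector in practice, it can be converted into horizontal and vertical angles. The sketch below mirrors the helper functions in Apple's sample code, but the exact sign conventions should be treated as an assumption and verified on device.

```swift
import Foundation
import simd

// Convert the unit direction vector reported by NearbyInteraction into angles.
func azimuth(from direction: simd_float3) -> Float {
    // Horizontal angle: positive when the peer is towards the right edge of the screen.
    asinf(direction.x)
}

func elevation(from direction: simd_float3) -> Float {
    // Vertical angle, derived from the y and z components.
    atan2f(direction.z, direction.y)
}
```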

Neither of our test devices appeared to have significantly better sensitivity or range than the other (consistent with the same chip being used in both).

Distance

Testing has shown that we can reliably detect distance changes between two UWB-enabled iOS devices (an iPhone 11 and an iPhone 12 Pro Max). Changing the relative positions by around 2cm triggers an update more than 90% of the time; a change in position of 3cm or more triggers an update 100% of the time. Positioning a human between the devices without a concomitant distance change does not appear to trigger an update. So, this is clearly a more accurate solution than BLE.

Direction

Directional updates are only possible within a relatively narrow field of view: laterally ±30°, vertically from approximately 11° up to 22° down. The measurements were made at home with limited tools, so they should be treated as estimates only.

Conclusions

So, what have we learned?

UWB can offer accurate distance and direction ranging for supported devices. If the technology had been available in the majority of devices in 2019, exposure notification systems would have been much more reliable. Android implementations are nascent, but more devices will be entering the market and hopefully the APIs will become available in the future. The current APIs from Apple are usable but not very flexible, and inter-platform compatibility remains an open question. The technology shows significant promise, and the next iterations from the major players should be followed. Apple have joined the Car Connectivity Consortium but have not joined either the UWB Alliance or the FiRa Consortium, which might indicate that they are either not interested or that they want to go their own way.

Our in-house development team, Hyperlab, spend a lot of time exploring new technologies and developing prototypes. Knowledgeable and enthusiastic about testing new hardware and software features, even before they are fully documented, Hyperlab understand the potential uses and limitations. Our robots test the interactions between PCDs (readers) and PICCs (cards, phones, wearables) at submillimetre accuracy. If you’d like to understand more about our work in Hyperlab, we’d love to hear from you at info@chyp.com.
