WEBVTT

00:00.000 --> 00:12.200
Hello, and please welcome Said, who is going to talk to us about the Lighthouse positioning

00:12.200 --> 00:16.560
system, something we have been using a lot, like crazy, and that has been doing an awesome

00:16.560 --> 00:17.560
job.

00:17.560 --> 00:21.800
And no pressure at all, so that is going to be great.

00:21.800 --> 00:22.800
Welcome, Said.

00:22.800 --> 00:23.800
Thank you very much.

00:23.800 --> 00:32.280
You know, I have to say there are way more people than I expected, so thank you for your

00:32.280 --> 00:33.280
interest.

00:33.280 --> 00:35.880
I hope the presentation is really good.

00:35.880 --> 00:43.560
So, as the slide says, my name is Alvarado, I am a PhD student at Inria Paris on swarm robotics

00:43.560 --> 00:49.360
localization, and I spent the last two years of my life trying to coax a VR gaming headset

00:49.360 --> 00:53.080
to do things it was never meant to do, and I am here to tell you all about it.

00:53.080 --> 01:04.560
We had a question in my laboratory: you know, VR gaming headsets are really good

01:04.560 --> 01:05.560
at localization.

01:05.560 --> 01:16.000
They are very fast, very responsive, really high precision, so can we do that, but for robots?

01:16.000 --> 01:21.640
Turns out the answer is yes — and not only yes, it works surprisingly well, like

01:21.640 --> 01:25.800
really high precision, really high accuracy, very low cost.

01:25.800 --> 01:31.200
So, I hope that in this talk, I can give you a general overview and introduction to this

01:31.200 --> 01:36.760
technology, how it works, what are the core concepts that you need to know to both understand

01:36.760 --> 01:41.600
it and use it, show you some of the open source tools that exist, and hopefully give

01:41.600 --> 01:45.600
you a general idea of whether or not it is a good fit for the projects you want

01:45.600 --> 01:48.320
to embark on.

01:48.320 --> 01:50.800
So, what is a lighthouse?

01:50.800 --> 01:59.640
So, a lighthouse is that little box over there; it is used for localization on the Valve Index

01:59.640 --> 02:04.680
and the HTC Vive VR gaming headsets. They are little boxes that you put in the corners

02:04.680 --> 02:08.920
of rooms, and they emit sweeping laser pulses.

02:08.920 --> 02:15.560
Now, the headsets and the handheld controllers have little sensors that receive this light, and

02:15.560 --> 02:19.880
with that they can compute their own position.

02:19.880 --> 02:22.480
Now, why would we want to do this?

02:22.480 --> 02:23.480
Are there alternatives?

02:23.480 --> 02:25.280
Yes, yes, there are many alternatives.

02:25.280 --> 02:29.560
In fact, if you want to use the best of the best, you will use a camera-based motion capture

02:29.560 --> 02:32.400
system like a Vicon or a Qualisys system.

02:32.400 --> 02:37.880
They are absolutely amazing — insanely high refresh rate, submillimeter precision — but they

02:37.880 --> 02:42.240
have one caveat: they are a centralized system. You have a bunch of cameras

02:42.280 --> 02:46.040
recording your robots, they send the information to your computer, your computer

02:46.040 --> 02:50.400
crunches the numbers, and then your computer knows where the robots are.

02:50.400 --> 02:54.540
The robots don't know where they are until you tell them, which is fine if you have like

02:54.540 --> 02:58.800
a dozen robots, you can just send it over Wi-Fi or something, but the moment you start dealing

02:58.800 --> 03:04.120
with swarms of a hundred or up to a thousand robots, this becomes a very serious problem

03:04.120 --> 03:05.920
on the bandwidth of your system.

03:05.920 --> 03:08.880
If you have been to any conference, you know how the Wi-Fi works when you have more than

03:08.880 --> 03:14.760
fifty people in a room — try a thousand robots, it just does not work.

03:14.760 --> 03:18.640
Also, they are great but expensive; we have one in my lab, I believe it costs like

03:18.640 --> 03:22.880
40k euros, which is fine, it's a very good technology, but it's not appropriate for every

03:22.880 --> 03:24.320
single scenario.

03:24.320 --> 03:29.920
On the other hand, the lighthouse positioning system has a fairly decent update rate, and five millimeters

03:29.920 --> 03:34.120
of precision is good enough for a robot of this size — that's enough.

03:34.120 --> 03:38.560
But more importantly, the moment the laser hits the sensor, the sensor has all the

03:38.560 --> 03:44.480
information it needs to compute its own position; thus the position is known by the robot.

03:44.480 --> 03:48.520
Then if the robot wants to tell you, the central controller, it can do so, but you don't need

03:48.520 --> 03:53.680
to have like a hundred-hertz update rate on the transmitter just to get a closed-loop

03:53.680 --> 03:54.680
control working.

03:54.680 --> 04:01.320
It is also fairly inexpensive, because it's based on very available hardware.

04:01.320 --> 04:04.320
So let's go into the details.

04:04.880 --> 04:09.480
This system is based on two different parts, we have the base stations which are little boxes.

04:09.480 --> 04:14.880
They have a bunch of little motors, lasers and mirrors that spin around and project a spinning

04:14.880 --> 04:17.360
pattern of lasers around it.

04:17.360 --> 04:24.280
It costs about 160 dollars if you buy it on Steam — they sell it there — and then you have the

04:24.280 --> 04:25.280
sensors.

04:25.280 --> 04:30.120
The sensors are these little things that actually receive the light. They are based on a custom-

04:30.160 --> 04:35.680
made chip called the TS4231, which has a really fun story: it was

04:35.680 --> 04:40.120
custom made for this and this application only by a US company called Triad

04:40.120 --> 04:44.680
Semiconductor, and for some reason, which I don't know, they just decided to sell them.

04:44.680 --> 04:48.520
So you can just buy them at DigiKey; they are there, with a datasheet and everything.

04:48.520 --> 04:52.920
Sadly, the lighthouse itself does not have a datasheet — it's not open source — and that is

04:52.920 --> 04:59.680
why the technology is not super widespread, even though the parts are actually available.

04:59.680 --> 05:04.240
Now, there are actually two types of lighthouse, version one and version two; this whole

05:04.240 --> 05:06.680
presentation is on version two.

05:06.680 --> 05:11.360
The difference is in the beam pattern. The lighthouse V1 has two motors that sweep

05:11.360 --> 05:16.880
a horizontal and a vertical laser plane, and in the lighthouse V2 they actually

05:16.880 --> 05:22.240
managed to fit the exact same horizontal and vertical information on a single motor, so they use

05:22.240 --> 05:25.840
just a single spinning motor and two V-shaped planes.

05:25.840 --> 05:29.080
From this moment onwards, if I'm talking about lighthouse, I'm referring to the V2;

05:29.080 --> 05:31.080
that's what we are going to be using.

05:31.080 --> 05:37.160
So, you have a sensor, you have the base station set up, the lasers sweeping around,

05:37.160 --> 05:41.760
a laser hits the sensor — what happens then?

05:41.760 --> 05:48.440
So, the chip has two output lines. One is called envelope, which just tells you, hey, there

05:48.440 --> 05:52.760
is data here — it's just a single interrupt line — and then you have the data line, which

05:52.760 --> 05:54.760
receives a bit stream.

05:54.760 --> 05:59.520
Now, to be specific, this is a 12-megahertz, differential-Manchester-encoded

05:59.520 --> 06:04.480
stream of a pseudo-random number sequence, but for all practical effects and purposes,

06:04.480 --> 06:10.640
it's just a really huge 96-kilobit bit stream that is known — there is a table,

06:10.640 --> 06:13.640
and you can just look up exactly which bits go where.

06:13.640 --> 06:18.640
Now, a small parenthesis, because this is important for anyone who wants to implement this,

06:18.640 --> 06:23.440
in my laboratory, we found out that you need to sample this line at at least 32 megahertz,

06:23.440 --> 06:29.800
otherwise the signal is just not recoverable. 12 megahertz is aggressively fast for

06:29.800 --> 06:36.120
a signal you have to sample on a GPIO — you have about 80 nanoseconds between pulses —

06:36.120 --> 06:40.440
and this turned out to be the main limitation of implementing this technology on a particular

06:40.440 --> 06:47.440
low-power microcontroller. If your MCU can do 32 megasamples per second on a GPIO, you're good

06:47.440 --> 06:50.760
to go, if it can't, you are going to have to find another one.

06:51.240 --> 06:57.400
Pardon, this is close. So, when the base station starts to sweep, it starts at the right,

06:57.400 --> 07:02.200
and it starts sweeping and sending the bit stream — bit 0, 1, 2, 3, 4, 5,

07:02.200 --> 07:07.160
and so on. The base station rotates at constant speed, and the bit stream is transmitted

07:07.160 --> 07:12.600
also at constant speed. So, by the time you hit about 90 degrees, you're already at

07:12.600 --> 07:18.120
bit 48,000. It keeps going, you keep receiving a part of the bit stream, and by the time

07:18.120 --> 07:26.040
you finish the half rotation, you are at roughly 96K bits. So, you have a sensor,

07:26.840 --> 07:31.720
it gets hit by the laser, and you receive a bunch of data. You grab that data, and you go to

07:31.720 --> 07:35.880
the gigantic look-up table and check, what part of the bit stream did I just receive?
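
In code, the lookup step just described is a simple proportionality. A sketch using the talk's round numbers (roughly 96,000 bits per half rotation) — a real decoder would first find the bit index by matching the received chunk against the known pseudo-random sequence:

```python
# Illustrative sketch only, using the talk's round numbers:
# roughly 96,000 bits are transmitted per 180-degree half rotation.
# A real decoder first identifies the bit index by matching the received
# chunk against the known pseudo-random (LFSR) bit stream.

BITS_PER_HALF_ROTATION = 96_000  # approximate figure from the talk

def angle_from_bit_index(bit_index: int) -> float:
    """Rotor angle (degrees) at which this bit of the stream was transmitted."""
    return bit_index / BITS_PER_HALF_ROTATION * 180.0

print(angle_from_bit_index(24_000))  # bit 24,000 -> 45 degrees, as in the talk
print(angle_from_bit_index(48_000))  # bit 48,000 -> 90 degrees
```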

07:37.320 --> 07:42.200
For example, you start counting and you say, okay, I got bit number 24,000;

07:42.200 --> 07:46.120
you can get a very direct correspondence — that means that you just received

07:46.840 --> 07:52.680
a laser that was transmitted at 45 degrees from the base station, and that's the whole point of the

07:52.680 --> 07:58.280
bit stream. This is just a very complicated way of telling the sensor what the rotational

07:58.280 --> 08:08.360
position of the lighthouse was when the laser hit the sensor. Why? Because, if you have, for example,

08:08.360 --> 08:13.400
two base stations positioned at known locations, and one of them is reporting 30 degrees, and the

08:13.480 --> 08:18.920
other one is reporting 110 degrees, then the only possible position where the sensor could be

08:20.120 --> 08:27.320
is this. Then, since we also get not only horizontal but also vertical angle information,

08:27.320 --> 08:31.640
we can also extend this to three-dimensional positioning, and thus you can do a full triangulation,

08:31.640 --> 08:39.320
and you get the position of your sensor, wherever it is. Okay — a couple of technical points: we need to

08:39.400 --> 08:44.840
talk about a little caveat. This is very nice, but we just said that the lighthouse

08:44.840 --> 08:50.440
actually projects a V shape — it doesn't do a little cross — so how do we correlate those two things?

08:51.400 --> 08:54.760
So, if you put yourself in the position of the lighthouse looking forward,

08:55.480 --> 09:01.160
what you see is a little V pattern going from right to left, and what you get on the sensor are the

09:01.160 --> 09:07.560
rotational positions when those planes hit the sensor, which are called alpha one and alpha two,

09:07.640 --> 09:12.120
and with some confusing-looking, but not actually difficult, trigonometry,

09:12.920 --> 09:19.400
you can actually translate that to both azimuth, the horizontal angle, and elevation, the vertical angle,

09:19.400 --> 09:25.080
and we are back to doing triangulation. Now, a small summary of the whole thing:
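
As a toy illustration of the triangulation step, here is a 2D sketch with invented station positions: each station reports the azimuth of the ray toward the sensor, and the sensor sits at the intersection of the two rays.

```python
import math

# Toy 2D triangulation sketch. The two base stations are at assumed, known
# positions; each reports the azimuth of the ray toward the sensor, and the
# sensor lies at the intersection of the two rays. All numbers are invented.

def triangulate_2d(p1, a1_deg, p2, a2_deg):
    """Intersect the ray from p1 at angle a1 with the ray from p2 at angle a2."""
    d1 = (math.cos(math.radians(a1_deg)), math.sin(math.radians(a1_deg)))
    d2 = (math.cos(math.radians(a2_deg)), math.sin(math.radians(a2_deg)))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 (2x2 linear system, Cramer's rule).
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    t1 = ((p2[0] - p1[0]) * (-d2[1]) - (-d2[0]) * (p2[1] - p1[1])) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Stations 2 m apart; the true sensor position generates the two measured
# angles, and triangulation recovers it.
sensor = (1.0, 1.5)
s1, s2 = (0.0, 0.0), (2.0, 0.0)
a1 = math.degrees(math.atan2(sensor[1] - s1[1], sensor[0] - s1[0]))
a2 = math.degrees(math.atan2(sensor[1] - s2[1], sensor[0] - s2[0]))
x, y = triangulate_2d(s1, a1, s2, a2)
print(round(x, 6), round(y, 6))  # recovers (1.0, 1.5)
```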

09:26.040 --> 09:31.400
you have a lighthouse, it emits a V. You have a sensor, it receives it and gets a bit stream;

09:31.400 --> 09:35.480
from the bit stream, you calculate the azimuth and elevation, and from those you

09:35.480 --> 09:43.960
triangulate the position of the sensor. That's pretty much it. Except, you know, there's a

09:43.960 --> 09:48.760
huge limitation: you have to know the position of the base stations to be able to compute the

09:48.760 --> 09:53.240
location of the sensor. And you're not going to spend an afternoon measuring where you put the

09:53.240 --> 09:57.160
lighthouses with like millimeter precision just to get the thing working — that makes no sense.

09:59.240 --> 10:05.160
So, let's fix that. Thankfully, this problem was solved for us about 30 years ago

10:05.160 --> 10:10.360
by computer vision researchers, because this is the same problem as stereo vision calibration,

10:11.000 --> 10:14.920
and in the next slides, we're going to see that, through some little mathematical intuition,

10:14.920 --> 10:21.160
we can actually equate a lighthouse base station to a pinhole camera, and since we can

10:21.160 --> 10:25.000
treat it as a camera, that means we can take advantage of decades upon decades of

10:25.000 --> 10:28.440
research in computer vision to solve all of this for us.

10:31.000 --> 10:33.000
A little pause for water.

10:36.120 --> 10:37.160
Wonderful.

10:37.160 --> 10:41.960
So, imagine this: you have a base station, you have a sensor, you have the line that connects

10:41.960 --> 10:46.360
both of them. You know the angle of that line, because that's what we've been getting from the

10:46.360 --> 10:52.840
bit stream. Imagine you have a camera — an image plane — that you put in front of the lighthouse,

10:53.400 --> 10:59.160
and then you intersect the laser going from the base station to the sensor with that plane.

11:00.040 --> 11:03.720
Looking at it in 3D, it looks something like this: you have a base station

11:03.720 --> 11:09.400
overlooking some robots moving around, and we want to transform azimuth and elevation angles,

11:09.400 --> 11:14.520
which are kind of unwieldy, into pixels. Why do we want this information as pixels?

11:14.520 --> 11:18.760
Because pixels are nice to work with. Well, you intersect the rays with that imaginary plane,

11:18.840 --> 11:24.120
and wherever they hit, those are your pixels — and you have a lighthouse image.
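
The conversion chain just described (plane angles alpha one and alpha two, then azimuth and elevation, then an idealized pixel) can be sketched as follows. The plus-or-minus 30 degree plane tilt and the identity intrinsic matrix are assumptions for the demo, not values stated in the talk:

```python
import math

# Sketch with assumed parameters (not from the talk): the two V2 planes are
# modeled as tilted +/- 30 degrees from vertical, azimuth is measured from
# the station's forward axis, and the image plane sits one unit ahead
# (idealized pinhole camera, K = identity).

TILT = math.pi / 6  # assumed plane tilt

def angles_from_planes(alpha1, alpha2):
    """Rotor angles at which planes 1 and 2 hit the sensor -> (azimuth, elevation)."""
    azimuth = (alpha1 + alpha2) / 2
    elevation = math.atan(math.sin((alpha2 - alpha1) / 2) / math.tan(TILT))
    return azimuth, elevation

def project_to_pixel(azimuth, elevation):
    """Intersect the ray with the imaginary image plane one unit ahead."""
    return math.tan(azimuth), math.tan(elevation) / math.cos(azimuth)

# A sensor dead ahead: both planes cross at the same rotor angle, so the
# elevation is zero and the pixel lands at the image centre.
print(project_to_pixel(*angles_from_planes(0.0, 0.0)))  # (0.0, 0.0)
```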

11:25.960 --> 11:28.360
The position of the pixels is related to the

11:29.560 --> 11:33.560
azimuth and elevation angles you were getting, and K is the

11:33.560 --> 11:38.120
intrinsic camera matrix; the only thing it means is that we're working with an idealized

11:38.120 --> 11:44.360
pinhole camera model. It's a really simple model. Once you have this done — this is the crux

11:44.440 --> 11:49.560
of lighthouse localization. Triangulation is nice, but if you treat it as a camera,

11:49.560 --> 11:55.720
then you have access to way, way more techniques — like, for example, stereo vision.

11:56.920 --> 12:02.760
Now, the stereo vision problem is as follows: you have a real-world object in some location,

12:03.160 --> 12:07.560
and you have a view of that object from two different cameras in two different locations.

12:07.560 --> 12:13.560
Cameras, base stations — whatever we want to call them. And the question is: from

12:13.560 --> 12:18.680
both perspectives, can we calculate the rotation and translation from one camera to the other?

12:19.240 --> 12:22.840
Because knowing this is the same as knowing where the based stations are located,

12:22.840 --> 12:27.240
as long as you know that, you can go back to triangulation. The problem is solved. Once you know

12:27.240 --> 12:31.160
that, you can also do a lot of really cool stuff — like, for example, photogrammetry,

12:31.160 --> 12:36.360
3D reconstruction, or depth estimation based on your images. Really cool technology.

12:37.880 --> 12:43.720
Let's explore two ways that we tried, and that we know work, for doing this with lighthouse localization.

12:45.240 --> 12:51.080
Number one is essential matrix reconstruction. It's not as bad as it sounds; there is

12:51.080 --> 12:56.600
a library that does it for you. You have two base stations, and we are looking at a robot that

12:56.600 --> 13:02.840
has only a single sensor — this is for single-sensor reconstruction. If you have at least seven points

13:02.920 --> 13:07.880
that you know correspond between both views, you can go to OpenCV and calculate the

13:07.880 --> 13:14.040
essential matrix. The essential matrix is the matrix that relates both views of the same object.

13:14.040 --> 13:16.680
There is a function for that; you need the math yourself only to understand it.

13:17.400 --> 13:21.560
You need at least seven points, and they must not be coplanar. The function

13:22.120 --> 13:26.760
spits out the matrix, and then you can do a singular value decomposition, and it gives you

13:26.760 --> 13:36.920
R and t. You know where the stations are; you can do triangulation. Now, if you have more than one sensor —
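
The talk defers the essential matrix to OpenCV (its findEssentialMat and recoverPose functions do this job). As a library-free sketch of the linear estimation underneath, on purely synthetic, invented geometry:

```python
import numpy as np

# Library-free sketch of essential-matrix calibration on synthetic data.
# All geometry below is invented for the demo. Two "cameras" (base stations)
# see the same 3D points; from the correspondences alone we estimate the
# matrix E relating the two views (x2^T E x1 = 0), which an SVD can then
# decompose into the rotation and translation between the stations.

rng = np.random.default_rng(0)
pts3d = rng.uniform(-1, 1, (12, 3)) + np.array([0.0, 0.0, 4.0])  # in front

# Ground-truth pose of station 2 relative to station 1 (invented).
angle = 0.3
R = np.array([[np.cos(angle), 0.0, np.sin(angle)],
              [0.0, 1.0, 0.0],
              [-np.sin(angle), 0.0, np.cos(angle)]])
t = np.array([1.0, 0.2, 0.1])

def normalized(p):
    """Idealized pinhole projection: (X/Z, Y/Z, 1)."""
    return p / p[:, 2:3]

x1 = normalized(pts3d)            # view from station 1 at the origin
x2 = normalized((pts3d - t) @ R)  # view from station 2

# Each correspondence gives one linear equation in the 9 entries of E.
A = np.stack([np.outer(b, a).ravel() for a, b in zip(x1, x2)])
E = np.linalg.svd(A)[2][-1].reshape(3, 3)  # null vector of A, reshaped

# The estimate satisfies the epipolar constraint for every correspondence.
residuals = [abs(b @ E @ a) for a, b in zip(x1, x2)]
print(max(residuals))  # numerically ~0
```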

13:36.920 --> 13:40.520
for example, you have four — you can do something that's called the perspective-n-point problem,

13:40.520 --> 13:45.960
which is the following. You have one or more cameras looking at an object that you know:

13:45.960 --> 13:50.360
you know the dimensions and shape of this object, or you have known markers on it.

13:51.480 --> 13:56.200
If you know what it is, and what it's supposed to look like, you can guess from the picture

13:56.200 --> 14:00.520
where the picture was taken from. It's a very simple concept. If you have a car and you have a

14:00.520 --> 14:04.360
camera and you take a picture of the car, if in the picture you can see the trunk,

14:04.360 --> 14:09.480
you are not taking the picture from the front. You are clearly behind the car. If you know the

14:09.480 --> 14:14.600
geometry of the object with high precision, then you can get really high precision on where

14:14.600 --> 14:20.760
the camera was when it took the picture. Say, for example, you have this PCB that has four

14:20.760 --> 14:25.560
lighthouse sensors in four known locations. If you know where they are, you take a measurement of each,

14:25.640 --> 14:30.600
and based on how it looks, you know where the camera is — where the cameras are. This also has an

14:30.600 --> 14:36.600
implementation in OpenCV. You need at least four points — it doesn't matter if they are coplanar or

14:36.600 --> 14:41.800
not, you put that in there, and it will give you the position of the base stations, and again,

14:41.800 --> 14:46.840
you are ready to go back to the triangulation. So, this is all cool enough, a lot of math is fine,
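
The talk again defers to OpenCV (its solvePnP function). As a sketch of the planar case underneath — four coplanar sensors like the PCB described, with invented geometry and an idealized identity intrinsic matrix — the station pose can be recovered through a homography:

```python
import numpy as np

# Sketch of the planar perspective-n-point idea: four coplanar sensors
# (like the four-sensor PCB described), with an invented layout and station
# pose, and an idealized pinhole camera (K = identity). In practice you
# would call OpenCV's solvePnP; this shows the homography route underneath.

obj = np.array([[0.0, 0.0], [0.1, 0.0], [0.1, 0.1], [0.0, 0.1]])  # metres, z = 0

a = 0.2  # ground-truth station pose (invented): rotation about z, plus t
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a), np.cos(a), 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([0.05, -0.02, 2.0])

def project(p):
    """World point (x, y, 0) -> normalized pixel via the true pose."""
    pc = R @ np.array([p[0], p[1], 0.0]) + t
    return pc[:2] / pc[2]

img = np.array([project(p) for p in obj])

# Direct linear transform: homography H mapping plane points to pixels.
rows = []
for (X, Y), (u, v) in zip(obj, img):
    rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
    rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
H = np.linalg.svd(np.array(rows))[2][-1].reshape(3, 3)

# For coplanar points, H = [r1 r2 t] up to scale; ||r1|| = 1 fixes the
# scale, and the sign is fixed by requiring the object in front (t_z > 0).
scale = 1.0 / np.linalg.norm(H[:, 0])
if H[2, 2] < 0:
    scale = -scale
t_est = scale * H[:, 2]
print(np.round(t_est, 4))  # recovers the invented t = [0.05, -0.02, 2.0]
```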

14:46.840 --> 14:52.600
but how well does this actually work? Now, here are some numbers to back that up. When I said it

14:52.600 --> 14:56.760
works well, this is based on a paper that my lab and I published, as well as a paper that the

14:56.760 --> 15:04.280
Bitcraze people published. Both single-sensor and multi-sensor tracking give you about one centimeter

15:04.280 --> 15:09.000
of precision — sometimes less, sometimes more. But there is another really cool result that

15:09.000 --> 15:15.160
we like to point out: the system is really stable. Like, if you put a sensor on a table and you

15:15.160 --> 15:19.720
measure it, the variation in the measurement — the spread of the measurement — is less than

15:19.720 --> 15:23.640
a millimeter, which means that you can put two sensors right next to each other, two meters

15:23.640 --> 15:28.840
away from the base station, and the base station will be able to tell them apart, which is

15:28.840 --> 15:40.520
really cool for micro-precision positioning. Disclaimer: these numbers were taken in a two-by-two-by-two

15:40.520 --> 15:44.920
meter cube space, so it's not like we are saying, yeah, we are very precise in this tiny

15:44.920 --> 15:53.560
ten-by-ten-centimeter spot — no, no, it can be done in an area like this. Cool — that's cool and all.

15:54.440 --> 16:03.160
So, I'm sold, I want one — where can I buy it? So I have good news and bad news for you. Good news:

16:03.160 --> 16:10.760
lots of open source work. For those who have electronics hardware production capabilities,

16:11.720 --> 16:16.920
here is an implementation of a breakout board for the sensor chip. This was designed by myself.

16:16.920 --> 16:22.520
Nothing special — it's just the reference design from the datasheet — but if you

16:22.520 --> 16:28.360
want a PCB that has been known and tested to work many, many, many times, here you have our reference.

16:28.360 --> 16:32.680
Please take it. It's made in KiCad, and all the sources and the bill of materials are all open source

16:32.680 --> 16:38.760
and it's all there. I have made many of these; they all work. One little caveat: the TS4231 chip

16:38.760 --> 16:44.200
is a BGA with like 0.5-millimeter separation between the pads — not easy to solder at this

16:44.200 --> 16:54.440
size — so you might want to use PCBA manufacturing. Firmware-wise: my lab published a reference

16:54.440 --> 17:00.440
decoder for the lighthouse pulses, based on the Raspberry Pi Pico. You can grab a Raspberry Pi Pico,

17:00.440 --> 17:05.640
connect four of the sensors to it, put it in front of a couple of base stations, and it will just

17:05.640 --> 17:10.920
start streaming, through serial, all the angle information it gets. It's pretty stable, it works really

17:10.920 --> 17:18.760
well, and it's very fast — it takes like 30 microseconds per pulse for a computation. So if you

17:18.760 --> 17:23.240
want a reference decoder, you can go here. If you want a reference on the math behind it,

17:23.240 --> 17:30.200
the link to the paper explaining all of this is on the page for this presentation. We also have an

17:30.200 --> 17:37.160
implementation for the Nordic Semiconductor nRF52 and nRF53 families, but due to hardware limitations,

17:37.160 --> 17:45.000
those can only handle a single lighthouse sensor. That's nice, but I don't want to do research on this,

17:45.000 --> 17:52.200
you say — I want to buy one. That's the bad news. The closest thing you can get right now: the

17:52.200 --> 17:57.800
Bitcraze company sells this very nice Lighthouse deck — I believe we have a talk later today

17:57.800 --> 18:02.040
where someone used them to localize a robot — and it has a very nice user interface. It's already

18:02.040 --> 18:07.480
done for you. You can plug it in. It's made to work with their drones, but it works with whatever

18:07.480 --> 18:12.760
you want to put it on. I believe I have permission to say this: they are working on a stand-alone

18:12.760 --> 18:21.000
version of this, which will be out someday, maybe. Hopefully. Now, before we finish, I would

18:21.000 --> 18:26.280
like to say thank you, both to the Bitcraze company and the libsurvive project, because they did

18:26.280 --> 18:33.000
a lot of the hard reverse-engineering effort of getting this technology to work.

18:33.000 --> 18:37.240
I am just the latest on a long list of people who have worked on this, and without their open

18:37.240 --> 18:40.840
source work, none of this would be possible. So, I thank them for their

18:40.840 --> 18:44.920
contribution. And I thank all of you for coming here. I hope you enjoyed the presentation.

18:45.000 --> 18:59.000
Thank you. Thank you, Said. I believe we have time for a couple of questions. Yes.

18:59.000 --> 19:07.720
Yes, it is an indoor positioning system, so the sensor has to be inside of

19:07.720 --> 19:14.120
the lighthouse's coverage permanently. How do you solve this? — Sorry, I didn't quite get the question.

19:14.120 --> 19:22.280
The sensor has to be inside of the lighthouse's laser. — It should be in the area, and according to

19:22.280 --> 19:28.680
Valve, the lighthouse projects a 5-by-5-meter projection cone in front of it. So, if you put one here,

19:28.680 --> 19:51.400
it should cover all of the area. — I have worked a lot on this kind of system. You have to be in

19:51.400 --> 19:55.800
line of sight — it's an optical system. But this is also why you put multiple base stations.

19:57.560 --> 20:01.960
So, for your robot, we usually have a Kalman filter or some kind of sensor augmentation that takes care

20:01.960 --> 20:09.080
of the dead spots. And by having multiple base stations, you will be able to cover more area as well.

20:09.080 --> 20:12.840
But it's very much like a camera or a mocap system: it has to be in the line of sight.

20:13.160 --> 20:21.480
I played around with lighthouses with multiple trackers, and I'm using the Vive tracker.

20:21.480 --> 20:27.080
The problem I always had is, if I lose line of sight of one, the whole system desynchronizes

20:27.080 --> 20:34.040
and collapses. Did you have the same problem, or did you solve that? — In our particular case, since we're

20:34.040 --> 20:39.880
using this for robot localization, we do what was just mentioned: we don't rely on always seeing

20:39.880 --> 20:45.720
it, but we use it as a sort of GPS. We do dead reckoning while we don't have the lighthouse,

20:45.720 --> 20:51.080
and at some point we will get one of the pulses and we will be able to recalibrate based on that.

20:51.080 --> 20:54.600
The Vive tracker has the issue that, I think, it only relies on the lighthouse. I don't know if it has

20:54.600 --> 20:59.400
an internal IMU to do it. — Yeah, I think it has, for sure.

21:00.360 --> 21:07.640
And what is the application you are using this for — what are you building?

21:07.640 --> 21:15.640
Me? I'm currently designing a VR headset which is powered by lighthouse tracking, and here I'm using the

21:15.640 --> 21:24.600
Bitcraze prototype, but in the future I'm going to design my own PCB with more sensors and a

21:24.600 --> 21:38.520
Raspberry Pi Pico on it. So in the prototype I have 3 PCBs, and I want to make a larger PCB

21:39.320 --> 21:44.360
with more of them and everything combined. — Nice. Well, do please take the reference

21:44.360 --> 21:48.440
designs if you want to use them for anything.

21:57.640 --> 22:03.160
Have you tested this outside at noon? Sorry.

22:03.480 --> 22:13.000
Quick question here: the lasers in the lighthouse operate in infrared, right?

22:13.000 --> 22:16.760
So I would like to know if the system works in direct sunlight.

22:19.640 --> 22:24.680
So, yes, they are in fact infrared pulses and they do get interference from the sun.

22:24.680 --> 22:28.760
I haven't tested the system outdoors, because it's not meant for outdoor operation,

22:29.160 --> 22:35.080
but in our lab we do have very large windows and we do have moments of the day where you

22:35.080 --> 22:41.400
got a lot of direct sunlight on top of it. We noticed that you get a bit less range on it,

22:41.400 --> 22:47.320
but the chip self-calibrates and removes any DC offset. So it works fairly well even when sunlight

22:47.320 --> 22:51.320
is interfering. Haven't tried it outside though, like at noon — no idea.

22:52.200 --> 22:56.200
Go ahead.

22:58.200 --> 23:08.280
Is there time for one more? No? No, no — thank you. Did you actually test it at larger

23:08.280 --> 23:12.680
ranges? Because theoretically you can have 16 lighthouses, which would give you a fairly

23:12.680 --> 23:16.520
big range. Did you stretch this out to the max at some point?

23:16.520 --> 23:22.600
The max I have tested is with four base stations. We haven't gone all the way,

23:22.600 --> 23:27.000
but we are working on a project — the whole reason we are doing this is because we want to do

23:27.000 --> 23:32.440
like a thousand-robot testbed in a gigantic arena. So we will be testing that very soon.

23:32.440 --> 23:43.000
You might see our papers getting posted at some point. Okay. Thank you.

