MLOps Community

AI & Aliens: New Eyes on Ancient Questions

# Gen AI
# Space and the Ocean
# SEAQR Robotics
speakers
Richard Cloete
Laukien-Oumuamua Postdoctoral Research Fellow @ Harvard University

Richard is a computer scientist and Laukien-Oumuamua Postdoctoral Research Fellow at the Center for Astrophysics, Harvard University. As a member of the Galileo Project under Professor Avi Loeb's supervision, he develops AI models for detecting and tracking aerial objects, specializing in Unidentified Anomalous Phenomena (UAP).

Beyond UAP research, he collaborates with astronomers at the Minor Planet Center to create AI models for identifying potential interstellar objects using the upcoming Vera C. Rubin Observatory.

Richard is also the CEO and co-founder of SEAQR Robotics, a startup developing advanced unmanned surface vehicles to accelerate the discovery of novel life and phenomena in Earth's oceans and atmosphere.

Before joining Harvard, he completed a postdoctoral fellowship at the University of Cambridge, UK, where his research explored the intersection of emerging technologies and law. He grew up in Cape Town, South Africa, where he used to build Tesla coils, plasma globes, radio stethoscopes, microwave guns, AM radios, and bombs...

Demetrios Brinkmann
Chief Happiness Engineer @ MLOps Community

At the moment Demetrios is immersing himself in Machine Learning by interviewing experts from around the world in the weekly MLOps.community meetups. Demetrios is constantly learning and engaging in new activities to get uncomfortable and learn from his mistakes. He tries to bring creativity into every aspect of his life, whether that be analyzing the best paths forward, overcoming obstacles, or building lego houses with his daughter.

SUMMARY

Demetrios speaks with Dr. Richard Cloete, a Harvard computer scientist and founder of SEAQR Robotics, about his AI-driven work in tracking Unidentified Aerial Phenomena (UAPs) through the Galileo Project. Dr. Cloete explains their advanced sensor setup and the challenges of training AI in this niche field, leading to the creation of AeroSynth, a synthetic data tool.

He also discusses his collaboration with the Minor Planet Center on using AI to classify interstellar objects and upcoming telescope data. Additionally, he introduces SEAQR Robotics, applying similar AI techniques to oceanic research with unmanned vehicles for marine monitoring. The conversation explores AI’s role in advancing our understanding of space and the ocean.

TRANSCRIPT

Richard Cloete [00:00:00]: Hi, I'm Dr. Richard Cloete. I am a computer scientist at Harvard University, and I also run my startup, SEAQR Robotics. And I take my coffee instant and black, no sugar, no milk.

Demetrios [00:00:16]: Welcome back to a very, very special MLOps Community podcast. This one is right out of left field. Today we hit the e-brake and skid on by. 2025 is coming at me hot. We're talking about UAPs and how Richard is using AI to try and identify random things in the sky. We also talk about his startup, SEAQR, and how he is observing things in the sea, and his work with the Minor Planet Center on using AI and ML to sift through the mountains of data that they are getting at the Vera Rubin Observatory. That was the name of it. So if you like this and you're interested in what Richard is doing, I highly encourage you to check out the Galileo Project that we mention a few times, or check out what he's doing with SEAQR Robotics.

Demetrios [00:01:29]: Let's get into this episode and woo. Let me know what y'all think of this because my mind was blown. I. I gotta know man, like you're doing some AI that is not in the traditional side of AI, I think, or when people think AI these days, they don't necessarily think about the AI that you're doing, but you're doing such cool stuff and I want to know like what drew you into the world that you are in right now and maybe tell me a bit more about what exactly you're doing and how you're using AI to do it.

Richard Cloete [00:02:12]: Okay, well maybe we should start with a bit about the Galileo Project, an overview of what that is all about, because that's where I'm doing the AI, so it'll give some context to what we're doing. So at the Galileo Project we build a suite of sensors, and the aim is to essentially monitor as much of the electromagnetic spectrum as possible. So we've got infrared cameras, we've got an all-sky visible camera monitoring the sky as well. We've got acoustic sensors, we've got things like a magnetometer, and a few other sensors. And the idea is essentially to see if we can detect anything in the sky that is unusual. So we obviously expect to see things like birds and airplanes and whatever else might fly through the sky. But the main point is to look for UAP. So I don't know if you're familiar with UAP and what's recently been going on in the media and in government as well.

Richard Cloete [00:03:19]: But essentially we know that there's something going on and someone is not telling the truth. We don't know whether it's the government or the whistleblowers or whatever, but we don't care. As scientists, we just want to put instruments on the ground, collect data, and let the data speak for itself. That's really the purpose of the Galileo Project. Get some data and we'll see what happens from there. And one of the core systems, the one that I'm working on, is known as the Dalek. It's a hemispherical array of about eight cameras, with a camera on the top, at zenith. And I'm using the video feeds from there to see if we can detect and track any objects flying across the sky.

Richard Cloete [00:04:09]: So my task was really to build an AI model that could achieve this. And when I started, though, there was no real data set out there to work with: data coming from infrared cameras looking at the sky. There are a few data sets out there. You can go to Hugging Face or wherever, and you can get tons of different data sets for different machine learning purposes. But unfortunately a lot of them have problems. For example, we want to be able to detect airplanes, so we want a data set of airplanes.

Richard Cloete [00:04:44]: But these airplanes might be on a tarmac or they might be on a kid's T shirt or it might be some cartoon airplane or something. And we don't need that kind of data. We need data from cameras pointing up at the sky with aircraft flying through them. Same with birds. Birds might be two birds on a branch on a coffee mug or something. It's just not the kind of data we need. And so we pretty soon realized that what we needed to do was simulate our data. So we built a tool which we call AeroSynth.

Richard Cloete [00:05:21]: So, synthetic data, basically. And we used Blender, a 3D modeling and rendering application, which you're probably familiar with. And we use a bit of Python, grabbed a bunch of open-source 3D models, and basically distribute them randomly in the scene, within the view frustum of the camera, and then render them out. So you end up with images of airplanes or birds or a mixture: balloons, blimps, drones, all those kinds of things you might find in the sky. And you get a mixture of those in these images at different orientations and different distances from the camera, so that they are different sizes, under different lighting conditions. So we have lots of really good data sets that we then use to train YOLOv8 models. Well, YOLOv5 models in the previous paper, in the previous round, but now we've moved on to YOLOv8 and YOLOv11, which has just come out as well. And so we've trained several different flavors of the models on these data sets. And then we use a bunch of different tracking algorithms as well.
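The domain-randomization idea behind AeroSynth can be sketched in plain Python. This is an illustrative sketch, not AeroSynth's actual code: the class list, parameter ranges, and the crude distance-to-size rule are all assumptions, and the Blender rendering step is omitted. It only shows how randomized placements might be turned into YOLO-format labels (class id, center x/y, width, height, all normalized).

```python
import random

# Illustrative class list -- AeroSynth's real asset set may differ.
CLASSES = ["airplane", "bird", "balloon", "blimp", "drone"]

def random_scene(n_objects=3, seed=None):
    """Sample randomized placements for synthetic sky objects.

    Each object gets a class, a normalized position in the camera view,
    a distance (which drives apparent size), and an orientation; the
    scene gets one global lighting condition, mirroring the variation
    described for the AeroSynth renders.
    """
    rng = random.Random(seed)
    return {
        "lighting": rng.choice(["dawn", "noon", "dusk", "overcast"]),
        "objects": [
            {
                "class": rng.choice(CLASSES),
                "x": rng.random(),              # normalized image coords
                "y": rng.random(),
                "distance_m": rng.uniform(50, 5000),
                "yaw_deg": rng.uniform(0, 360),
            }
            for _ in range(n_objects)
        ],
    }

def to_yolo_labels(scene, ref_size=0.2, ref_distance_m=100.0):
    """Emit YOLO-format label lines: "class_id cx cy w h" (normalized).

    Apparent size shrinks inversely with distance (a crude pinhole
    approximation), so distant objects become small dots, as in the feeds.
    """
    lines = []
    for obj in scene["objects"]:
        size = min(1.0, ref_size * ref_distance_m / obj["distance_m"])
        cls_id = CLASSES.index(obj["class"])
        lines.append(
            f"{cls_id} {obj['x']:.4f} {obj['y']:.4f} {size:.4f} {size:.4f}"
        )
    return lines
```

In a real pipeline the same random parameters would drive `bpy` object transforms before each render, so every image ships with a ground-truth label file for free.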

Richard Cloete [00:06:40]: So we use things like the regular SORT algorithm, which is Simple Online and Realtime Tracking, and ByteTrack. And I think Ultralytics comes with a few as well. So we've tried a few of those, and essentially what we're doing is we're detecting objects in the video feeds and then tracking them. So we're extracting their trajectories, and then we're also working on algorithms to be able to look at these different trajectories, because that's what's really interesting. I mean, you might have an object, but it might just be a little white dot, you know, it might be far away. We don't yet have distance, so we're working on triangulation. But the trajectories are what's really interesting, because that'll tell you whether something is moving in a way that doesn't conform to typical known man-made or natural phenomena. And so the idea is that we get alerts generated every time the system identifies something unusual.
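SORT couples a Kalman motion model with IoU-based association; the hypothetical sketch below drops the motion model and keeps only the greedy IoU matching step, to show how per-frame detections become the track trajectories described here. It is not the Galileo Project's tracker.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

class GreedyTracker:
    """Greedy IoU association: SORT minus the Kalman motion model."""

    def __init__(self, iou_threshold=0.3):
        self.iou_threshold = iou_threshold
        self.next_id = 0
        self.tracks = {}  # track id -> list of boxes (the trajectory)

    def update(self, detections):
        """Match detections to open tracks; start new tracks otherwise."""
        assigned = {}
        unmatched = list(detections)
        for tid, traj in self.tracks.items():
            last = traj[-1]
            best, best_iou = None, self.iou_threshold
            for det in unmatched:
                score = iou(last, det)
                if score > best_iou:
                    best, best_iou = det, score
            if best is not None:
                traj.append(best)       # extend this trajectory
                assigned[tid] = best
                unmatched.remove(best)
        for det in unmatched:           # unmatched detections open new tracks
            self.tracks[self.next_id] = [det]
            assigned[self.next_id] = det
            self.next_id += 1
        return assigned
```

Production trackers like ByteTrack add motion prediction, score-aware matching, and track life-cycle management on top of this basic association, but the trajectory output (a per-ID list of boxes over time) is the same shape.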

Richard Cloete [00:07:41]: But we're not at that point yet. We've got the detection and tracking algorithms pretty nailed. There's always room in machine learning obviously to improve on your detection accuracy and that's an ongoing project and I suspect it will be for some time. But we are able to robustly extract out detections and tracks, which is really cool to see.

Demetrios [00:08:06]: And how many of these different apparatuses do you have around and where are they?

Richard Cloete [00:08:12]: We have at the moment three sites, I think. My memory is quite bad, so I think it's about three, and I can't really say where they are. One of the big questions we had around the Galileo Project when we started deploying to different sites was, can we go and tell people where we put these things? And the consensus was generally no. And the reason for that is because there's so much stigma and there are sensitive issues around the UAP/UFO subject. We don't want to give away the locations of our sensor apparatus in case you get some people out there who say, all right, I'm going to go over there and try and trick them, you know, and mess with our data streams or mess with the cameras, or just even vandalize the instruments. So for the time being, those are all kind of secret.

Demetrios [00:09:10]: Yeah. Well, hopefully you had one in New Jersey over the last month or two because I heard it was a pretty big hot spot there.

Richard Cloete [00:09:18]: Yeah, it was. There was a lot going on over there. I'm not entirely sure what was going on, but we do have some portable systems. I'll just say that. Yeah.

Demetrios [00:09:29]: Oh, cool.

Richard Cloete [00:09:30]: Yeah.

Demetrios [00:09:30]: So you can go almost storm chasing when there's things happening and you get a hotspot. And what made you want to get into this and recognize that you could combine these models with the identification and basically put AI to use at this task? Because I don't think I've heard a use case quite like this in the last five years I've been interviewing people, and that's one of the reasons that I wanted to get you on here: it is so unique.

Richard Cloete [00:10:07]: Yeah. So, I mean, there's so much data, so many video feeds that we have to process, and it's just not feasible to have someone sitting there monitoring 24/7. And on top of that, it's very often the case that the object might be very small and it's hard to discern exactly what it is. And humans are really bad at classifying. When you see things in video feeds or in images, people make all sorts of mistakes. One person says it's a bird, another says it's an airplane. So we don't want to rely on humans to process that data. Instead, we just build the machine learning model to do it for us.

Richard Cloete [00:10:48]: That way we get consistent answers 24/7, 365 days a year. And it's much faster as well.

Demetrios [00:10:57]: Yeah. So the obvious question is, have you found anything?

Richard Cloete [00:11:03]: There's been several cases where we've seen something that's interesting, but I must stress that at the moment the systems are not fully operational, in the sense that they're out there continuously processing data, sending us alerts, and detecting anomalies. We're not yet at that stage. However, we are in the commissioning phase. So we're at the moment deploying the instruments to the ground, leaving them out there, collecting data, analyzing that data. And the goal is really just to see if the data is what we expect, to see if the systems are performing the way we expect them to perform. And getting the baseline as well, that's really important. We're trying to get a baseline of what's normal for this area. And that's different for every area.

Richard Cloete [00:11:49]: One area might have more birds, or more aircraft if it's next to an airport or something. And so just through manual review of the data and testing algorithms, yeah, we pulled out some things that look really interesting. And unfortunately we haven't been able to explain them. We can't say that they are a UAP, and we can't say that they are a bird or an airplane, just because there's not enough data. And in a lot of these cases, we've only got one video, you know. So even though there's lots of cameras recording, it might be that at this particular time those cameras were down, or our acoustic system wasn't working, or there was some other issue. And so we don't have the kind of data we need to make a proper verification, which is data from multiple different instruments all corroborating the exact same event at the same time. And a key component of all this that's missing, which is actively under development, is the ability to triangulate. Because if we can triangulate objects, we can get their distance, and then from there we can extract things like the speed, velocity, and so on and so forth.

Richard Cloete [00:13:02]: And that's something we're working really hard towards. And we've made some really good progress there as well. But until we get that information, it's difficult to even say how big the object is. So it could be something which is really far away and huge, or it could be something small and close. But yeah, watch this space, because there's definitely some interesting things coming out. We just need to have the right process in place to make some verifications before we can talk more about them.
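As a toy illustration of the triangulation step, assume a flat 2D geometry with two cameras on a known baseline, each reporting a bearing to the object; the real system needs full 3D camera calibration, but the principle of recovering distance (and from repeated fixes, speed) is the same.

```python
import math

def triangulate_2d(baseline_m, bearing1_rad, bearing2_rad):
    """Locate an object seen from both ends of a known baseline.

    Camera 1 sits at the origin, camera 2 at (baseline_m, 0); each
    reports the bearing of the object measured from the baseline axis.
    Returns the object's (x, y) position in metres.
    """
    t1, t2 = math.tan(bearing1_rad), math.tan(bearing2_rad)
    if abs(t1 - t2) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    # Intersect the ray y = x*t1 with the ray y = (x - baseline_m)*t2.
    x = baseline_m * t2 / (t2 - t1)
    return x, x * t1

def speed_mps(p0, p1, dt_s):
    """Ground speed from two triangulated positions dt_s seconds apart."""
    return math.dist(p0, p1) / dt_s
```

With two such fixes a second apart, the same little white dot resolves into a position, a size scale, and a velocity, which is exactly the missing information described above.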

Demetrios [00:13:34]: And it feels like you're doing a lot of stuff out there on the edge, devoid of communication with any cloud services. And so it's just gathering this data, and then someone, or somehow, you're batch downloading it and then able to process it.

Richard Cloete [00:13:54]: Yeah, so everything is running at the edge. We have a Starlink system set up there as well, but we also have 5G, so we have the ability to egress data automatically, but not for live video feeds and things continuously. So instead what we do is we store the data locally, and then someone will go there and collect the data for the previous week or the previous two weeks or whatever. They'll bring it back to the Center for Astrophysics, and then they'll hook it up to our systems there and transfer that data to our cluster at Harvard, where we can then do the analysis.

Demetrios [00:14:39]: How do you separate the wheat from the chaff? Where I'm sure you get a lot of really interesting types of folks that want to come and do stuff here, where just saying aliens makes, I think, a certain type of person really excited and another type of person turned off. But you're doing it, it seems like. I mean, just talking to you for the last 20 minutes, you've got two feet on the ground. You don't look like a few of my friends growing up that wore alien shirts. And so how has it been being in the scientific community, trying to do stuff in this field and staying on the straight and narrow, if we can call it that?

Richard Cloete [00:15:31]: Yeah, you know, there's so much stigma around the UFO/UAP subject. It's been going on for decades already. If you see a news segment, you'll likely hear the X-Files music playing in the background or something like that. There's just no need for this stuff. So we're taking a completely scientific, pragmatic approach here. We're not trying to make any assumptions about what we might or might not find. We're just saying, let's get the instruments out there, put them on the ground, collect the data, and then we can talk about the data. We'll open source that data and we can argue about what it might or might not be.

Richard Cloete [00:16:06]: Let other scientists interpret that data. And all the conversations that I've had, whether with peers or whoever, have been on the same line of thought, and we've not really had any pushback against that. There's some people who of course have a little bit of a "these guys are crazy" mindset, but, I mean, so what? If you don't look, you're not going to find anything. So let's get out there and have a look. At least we can say we tried. And when it comes to bringing people onto the project, of course we do interviews as well. So we sit down with a person and chat with them to just make sure that they're not on the other side. The UFO community is a bit of a mixed bag, and generally we bring in technical people, because that's where the real work is.

Richard Cloete [00:17:07]: And there's still a lot, still a lot to do there. We're always short on people. It's a completely volunteer based project. So we're always looking for additional help from that front.

Demetrios [00:17:17]: That's excellent. Okay, so that's UAP stuff, which I find absolutely fascinating. And I imagine it keeps you up at night and you love working on it and that's why you're doing it. There's also other really cool things that you're doing that I want to get into. One, being around like observatories and how you are seeing AI being used with telescopes and being used like with images from deep space and all of that. Can you go into some of that?

Richard Cloete [00:17:51]: I work very closely as well with the Minor Planet Center. And one of the big tasks that we're looking at now is that there's going to be a new telescope coming online very soon, called the Vera Rubin Telescope, or the LSST. And this telescope will collect a lot more data than previous-generation telescopes. It'll see a lot more of the sky, and it'll see much further. So you can imagine going into your backyard at night with a flashlight and you can only see so far. And then you come along with a bigger flashlight and you can suddenly see much further, and there's a whole lot of new objects, new things at the back of your garden. And it's the same kind of thing with this new telescope coming online: we'll be seeing so much further, and there's so many new objects, and not just different varieties, but so many more of them.

Richard Cloete [00:18:46]: And so one of the things that kind of recently picked up a little bit of traction is that of interstellar objects. We've had Oumuamua, which was the first interstellar object to enter our Solar system in 2017. Well, the first one detected, let's say. And so working with the data from these telescopes, I'm building models to be able to identify which ones, let's say flag which ones might be interstellar objects.
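The defining signature of an interstellar object is an unbound, hyperbolic orbit, meaning an eccentricity greater than 1 ('Oumuamua's was about 1.2). A toy classifier on that single feature might look like the sketch below; a real pipeline works with many more orbital features and their measurement uncertainties.

```python
def classify_orbit(eccentricity):
    """Coarse orbit class from eccentricity alone.

    A bound orbit has e < 1 (ellipse), e == 1 is parabolic, and e > 1
    is hyperbolic -- unbound, the signature of an interstellar visitor
    ('Oumuamua's eccentricity was about 1.2). A real pipeline must also
    weigh measurement uncertainty, since a noisy e slightly above 1 can
    still belong to a bound object.
    """
    if eccentricity < 1.0:
        return "bound"
    if eccentricity == 1.0:
        return "parabolic"
    return "hyperbolic (interstellar candidate)"
```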

Demetrios [00:19:21]: And an interstellar object is just something that comes into our solar system for a little bit and then it leaves.

Richard Cloete [00:19:29]: Yeah, that's right. Could be. It could be anything. It could be a rock, it could be a spaceship. Who knows, you know, it could be anything.

Demetrios [00:19:35]: So keeping up with the theme.

Richard Cloete [00:19:37]: Yeah, I like it.

Demetrios [00:19:39]: This telescope that you're talking about, the Vera Rubin, it's in outer space?

Richard Cloete [00:19:44]: Oh, no, it's ground-based.

Demetrios [00:19:46]: Okay.

Richard Cloete [00:19:46]: Wow. Yeah, I think it's in Chile.

Demetrios [00:19:49]: But it's more powerful because. Just because of the technology.

Richard Cloete [00:19:53]: Yeah, it's got a much bigger aperture as well. It can just see much further into space than regular ground based systems.

Demetrios [00:20:02]: There's also the one that is in space.

Richard Cloete [00:20:05]: Right.

Demetrios [00:20:06]: The James Webb or the Hubble. And that one is fascinating because it gets above all of the ozone. I had thought that the ozone was a bit of a pain in the butt when it comes to the telescopes, because it can muddy up the vision or the image.

Richard Cloete [00:20:26]: Yeah, honestly, I'm not a big telescope buff. I just work with the data. So I can't really describe much about the technical aspects of the telescopes, but there is definitely atmospheric disturbance, which plays a role, and you have to factor that in. But the Vera Rubin telescope will image the entire southern sky like two or three times a night, I can't remember. So that's in contrast to something like the James Webb or the Hubble, which is much more narrow field of view.

Demetrios [00:21:00]: Okay. Yeah. And I see the difference there. And so the idea is like, you're going to be creating different models that are going to help you identify when things come interstellar. Is there other models that you're creating?

Richard Cloete [00:21:18]: Yeah. So it's not just actually interstellar objects. We're trying to classify as many different types of objects as we can. So data typically gets submitted to the Minor Planet Center. And this data can come from other observatories, it can come from amateur astronomers or whoever, and it goes through one of their pipelines. And then, essentially before it hits their website, we want to be able to intercept that data. So we're working with the Minor Planet Center to be able to look at this data and identify whether one of the objects is an interstellar object, whether it's a main belter, a Jupiter Trojan, a Mars crosser, whatever. So there's tons of different classes of objects out there.

Richard Cloete [00:22:04]: And just by using machine learning models, we can kind of separate the different types of classes. There are already tools out there which work towards this. There's a good one, actually, called Digest2, which is a software that I've been working with as well. It kind of operates as a binary classifier. And what happens is, people submit an object to the Minor Planet Center and it produces what's called a Digest2 score. And this score ranges from 0 to 100. And if it goes over 65, it's classified as something interesting, basically a near-Earth object. And it goes onto their website and into the database.

Richard Cloete [00:22:49]: Anything else, like a main belter or a Mars crosser, we don't really care about those, because the Minor Planet Center is mostly focused on threats to Earth: near-Earth objects. So sometimes an interstellar object can be a near-Earth object. Oumuamua was classified as a near-Earth object. It was detected, and only discovered really, because it was detected as a near-Earth object. It was coming towards Earth and they thought, hey, what's that? Let's take a look. And people started investigating more, and then they thought, okay, there's some unusual properties about this. And that's how it all kind of blew up and got all the media attention it did. But there's just so much data coming in there.

Richard Cloete [00:23:33]: You can't have guys sitting there manually processing this data and looking to see if they spot anything unusual. So while this tool that we have at the moment, Digest2, works really well, there's a lot of false positives, right? So in many cases, what happens is an NEO is missed. It'll have a low Digest2 score, for example, and as a result, it'll get kind of kicked out of the system and people won't pay attention to it. And that could be for a variety of reasons. The problem is that we need to detect as many near-Earth objects as possible, because they might be a threat to Earth, could be on a collision course with Earth. So one of the things I'm working on at the Minor Planet Center is improving their system to be able to predict, with better accuracy and more confidence, whether an object is a near-Earth object or not. And we should have a paper coming out in probably a couple of months, I would say, on that.
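The thresholding behaviour described here, and the trade-off between missed NEOs and false positives, can be sketched as follows. This is not Digest2 itself, just the shape of a decision rule around a 0-to-100 score with a 65 cutoff; any scores and labels are hypothetical.

```python
def flag_candidates(scores, threshold=65):
    """Indices whose Digest2-style score exceeds the threshold."""
    return [i for i, s in enumerate(scores) if s > threshold]

def recall_precision(scores, is_neo, threshold=65):
    """Recall and precision of the threshold rule on labelled examples.

    Lowering the threshold catches more true NEOs (higher recall) at
    the cost of more false positives (lower precision) -- the trade-off
    a learned classifier tries to improve on.
    """
    flagged = set(flag_candidates(scores, threshold))
    tp = sum(1 for i in flagged if is_neo[i])                    # caught NEOs
    fn = sum(1 for i, neo in enumerate(is_neo)
             if neo and i not in flagged)                        # missed NEOs
    fp = len(flagged) - tp                                       # false alarms
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    precision = tp / (tp + fp) if flagged else 0.0
    return recall, precision
```

A model that reorders the scores so that true NEOs sit above the cutoff improves recall without paying the precision cost of simply lowering the threshold.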

Richard Cloete [00:24:34]: But yeah, machine learning is playing a big role in this because there's just so much data. And the models today are really good at differentiating between different classes of objects, especially in this kind of data that we're working with.

Demetrios [00:24:50]: And are you doing the same thing where it is trying to simulate different visual aspects of it, or is this more of like tabular data?

Richard Cloete [00:25:00]: That's right, yeah. It's just purely numerical at this point. We're not looking at any images, though we would like to at some point start incorporating that image data as well, because there's a lot you can get from the image data. You can get light curves and things like that. So, yeah, in the future we hope to be able to do that.

Demetrios [00:25:23]: Nice. Well, I also want to talk about the ocean exploration stuff you're doing. And SEAQR, I think is what it's called, right?

Richard Cloete [00:25:34]: Yeah.

Demetrios [00:25:34]: Robotics is super cool. So first of all, it makes complete sense to me why you would be very excited and interested in exploring outer space and then also wanting to explore the ocean, because there is so little that we know about it. So tell me a bit about SEAQR Robotics.

Richard Cloete [00:25:59]: Wow. Okay, so SEAQR Robotics is a company that I recently started. It follows the same kind of idea as the Galileo Project. Right? So in the Galileo Project, we're looking at the sky. We're trying to detect objects that are moving through the sky, and track them and classify them. But what I realized is that there's a lot of situations, events that happen around UAPs, UFOs, at sea. That's constant in the stories as well.

Richard Cloete [00:26:34]: Something coming out of the water, something going into the water. The three Navy videos that were released by the Pentagon, I think in 2017, those are all related to the ocean as well. You constantly hear reports of these objects buzzing nuclear submarines or aircraft carriers, nuclear-powered aircraft carriers. And so what I wanted to do was maybe move some of the technologies that we had developed at the Galileo Project over to the ocean and have a look at what's going on there. There's so much out there and it's just not being monitored, right? Like 70% of the planet is covered with ocean.

Richard Cloete [00:27:11]: And there's so few assets out there actually doing proper, robust monitoring. Of course there's things like buoys and there's this manned research vessels out there, and there's the military and fishing or whatever, but nobody's really sitting there paying attention to consistently monitoring the sky, the surface and the water below. And so the idea was to take the knowledge that I'd gained from through the Galileo project and build a network of these. They're called USVs, uncrewed surface vehicles. And you can Google what they look like online. They typically like a surfboard kind of shaped object. But the idea is essentially it's a floating observatory and we want to have a network of these things so that they are constantly monitoring these different domains and sending back data in real time, so sending back alerts every time something is detected. It could be schools of fish, it could be a submarine, it could be another boat on the ocean, it could be an airplane over a head or a bird or whatever.

Richard Cloete [00:28:16]: And so SEAQR is a for-profit company. But the idea here is that we can generate a lot of really important data for marine biologists, for the Coast Guard, for situational awareness, for defense, for atmospheric and climate research, for oceanography. There's just so much data that we can collect out there, and we can use these data as a means to fund the research and development of these platforms, on top of which we can then develop anomaly detection algorithms that would allow us to maybe look and see if we can find anything unusual. So it's kind of a two-pronged approach to the subject. Because I've been working with Avi now for two and a half years, and he's done an incredible job, but a common theme in all of this is that there's really no funding that comes along, because, as I mentioned, there's still so much stigma around the subject. There's no NSF grants out there, there's no funding tracks, essentially, to help scientists do the work that they need to do on the UFO subject. So Avi has been doing an incredible job raising funding, all through donations.

Richard Cloete [00:29:38]: The entire Galileo project is basically all funded through donations. But long term that's not really sustainable because eventually at some point, you know, we're going to run out of money and there's going to be nobody to step in there and donate some extra cash to us. So we need a sustainable revenue stream. And so that's where I started thinking a bit more broadly about how we can actually build something out there that will generate a revenue stream that would also allow the continuation of UAP research. And then also some of that funding could be funneled back to, I could, you know, seeker. If it gets to a point where it's generating revenue, we could start sponsoring postdocs at the Galileo project or other scientific projects, or we could purchase equipment, donate equipment for them. Or like the Sol foundation, another project based out of Stanford, we could sponsor them as well. So the more companies out there that are UAP friendly, I think the better.

Demetrios [00:30:39]: And so this surfboard, it has all the same sensors that you have with the Galileo Project ones that you were mentioning before, when you were putting them on the ground, like all these different cameras and acoustics, infrared and all that?

Richard Cloete [00:30:55]: So it'll have very similar sensors. Of course, the Galio is not, is not monitoring the ocean. So we will have additional things like hydrophones and conductivity sensors and salinity and other sensors. But the idea is basically the same, to just monitor, detect and track and classify in multiple domains.

Demetrios [00:31:18]: And how are you going to keep, or are you going to keep it in one spot or are you going to just let it float about?

Richard Cloete [00:31:24]: Well, that's a challenge. The ocean is a powerful place. So we are looking to use something like the Wave Glider. The Wave Glider is a technology which allows the object to move just through the motion of the waves. So I don't think we'll be able to keep it perfectly stationary. So the idea is essentially, if you imagine dividing the ocean into a grid, then each one of these USVs will be responsible for monitoring one of these grid cells. And to do that, we would basically have the USV just moving in circles within that grid.
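The grid-and-loiter scheme can be sketched with two small helpers: one mapping a position to its grid cell, one generating circular station-keeping waypoints around a cell centre. The cell size, radius units, and flat-plane geometry are illustrative assumptions, not SEAQR's design.

```python
import math

def grid_cell(lat_deg, lon_deg, cell_deg=1.0):
    """Map a position to the index of its (illustrative) grid cell."""
    return (int(math.floor(lat_deg / cell_deg)),
            int(math.floor(lon_deg / cell_deg)))

def loiter_waypoints(center, radius, n=8):
    """Waypoints on a circle around a cell centre for station-keeping.

    The USV cycles through these to counter drift while staying inside
    its assigned cell; units are whatever the caller's local frame uses.
    """
    cx, cy = center
    return [
        (cx + radius * math.cos(2 * math.pi * k / n),
         cy + radius * math.sin(2 * math.pi * k / n))
        for k in range(n)
    ]
```

In practice a wave-propelled vehicle cannot follow such waypoints exactly, so the controller would treat them as soft targets and re-plan as currents push the platform around its cell.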

Demetrios [00:32:01]: Wow. And so this hasn't gone out yet. You haven't been collecting data yet on anything?

Richard Cloete [00:32:09]: Yeah, it's still very much in the research and development stage. We have some funding to build our prototype, and we're well on our way there. We're actually looking to harness the wave energy, because a big problem with existing systems out there is that they can't stay out for indefinite amounts of time, just because they run out of energy. So we're looking to harness the motion of the waves to generate electricity and keep us out there for much longer.

Demetrios [00:32:42]: Man, that is so cool. I'm a huge fan of all of this that you're doing. And with SEAQR, is there any plan to also try and go under the water?

Richard Cloete [00:32:55]: So, yeah, that's a good question. To start off with, we will be monitoring just the ocean surface. In our MVP, we'll be monitoring maybe the first 10 to 20 meters below the ocean surface, and then the skies above. But as our system evolves and as we get more funding, we do plan to eventually build larger USVs: ones that are capable of mapping the seafloor, but also ones that serve as a mothership, let's call it, and allow us to drop probes down and then retrieve them. One of the big things that has a lot of research going on is measuring the water column. And so we'd be able to drop probes down, collect samples, maybe go to the ocean floor, rise back up, dock, and offload that data. But it could also serve as a communications hub.

Richard Cloete [00:34:04]: So we want to add things like acoustic modems, so that underwater assets, whether it be another ROV, underwater observatories, or any other probes that are out there underwater, can relay their information back to our SEAQR USV, and then we can send that back to the servers on land. Because typically these systems that are currently deployed in the ocean have to surface if they want to broadcast. They can go out there and collect all the data underwater, but then, in order to send the data, they have to surface. And if there's a SEAQR USV within range, they won't have to do that anymore.

Demetrios [00:34:45]: Oh, yeah.

Richard Cloete [00:34:46]: It's like a 5G network for the ocean.

Demetrios [00:34:49]: Yeah. Yeah. It is weird to me that we haven't done more in this space, especially knowing, as I think everybody's probably said at some point: oh, we know more about space than we do about our own oceans. And so the idea that we don't just set up different observatories in our ocean is kind of weird to me.

Richard Cloete [00:35:19]: Yeah, that's absolutely true. I mean, there's just so much open space out there, and nobody's out there, nobody's watching it, nobody has any idea what's going on. That case with the drones in New Jersey you mentioned earlier: those were apparently all coming from the ocean. Where are they coming from? If they're foreign tech, how are they getting there? It means that there's not enough observation going on, not enough monitoring, to be able to understand where these objects are coming from and where they're going when they go back into the water. There are also issues around the Coast Guard. They don't have a lot of USVs out there, so there are a lot of situational awareness gaps. A lot of the drug smugglers are surprisingly sophisticated.

Richard Cloete [00:36:06]: They actually have underwater drones. They can avoid manned vessels which are sitting there looking to see if they can see anything underwater. And these, for all intents and purposes, unmanned submarines are smuggling drugs, and we should be able to detect those.

Demetrios [00:36:25]: And the other piece that is fascinating is how the data that you collect, there's going to be demand for that data. I can imagine that. And it's not just going to be for folks like yourself that are trying to find anomalies in what's going on, but also just like anyone who wants to try to figure out, oh, is is X happening with the ocean or is Y? There's going to be so many questions that you now can add a little bit more color to.

Richard Cloete [00:37:01]: Yeah, that's right. And at the moment, the typical scenario is someone will come up with a research plan: all right, I want to measure the water temperature at such-and-such a location. And they would go and develop a proposal around that, submit it, and get the money, but then they'd have to either go and purchase the equipment and manually deploy it, retrieve it, and maintain it themselves, or they'd have to hire a company to go out and do it. There's a lot of cost and a lot of time overhead involved in all of that. And so what we're going to offer is essentially a USV as a service. These things will be out there on the ocean, and you say, oh, we need this data.

Richard Cloete [00:37:44]: Maybe the data is already Available you can just access through a subscription plan or purchase the data package from us. Or you could take control of a fleet of USVs and Go and conduct a mission. Plan it all from the comfort of your office or your home and don't have to worry about the deployment, the retrieval, the maintenance or, or any of that. And that as well. The key thing is that the Seeker USBs will be out there constantly. So there'll be persistent coverage of data. With these typical, with typical missions, you go out there, you collect your temperature data, and then you come back and then we have no idea what the temperature is next week. You know, so there's just, there's just so much, so little observation on the ocean.

Demetrios [00:38:30]: Yeah. And are you not afraid that the traffic in the ocean is going to be a problem?

Richard Cloete [00:38:38]: No shipping routes are known. You don't have, you know, ships just going off wherever they want. They want to take the optimal path when they're transporting cargo. And that's something maybe we can help improve on as well with the data that we collect. There will be cases, of course, where we might have to cross a shipping lane or something like that, but these systems will all have on them what's called, I think it's AIs or Automatic Identification system. So all marine vessels need to have this system where they broadcast their position and some other metadata, like what they are and things like that, and also be able to receive. So all the secret drones also have on them obstacle avoidance abilities, of course, because we'll have the ability to analyze the camera feeds anyway and do object detection and tracking. So I don't think there's going to be any issues.

Richard Cloete [00:39:34]: I think there's more probably an issue going to be with maybe sharks. Yeah.

Demetrios [00:39:39]: Especially if it's shaped like a surfboard then.

Richard Cloete [00:39:42]: Yeah, I also be a little bit different shape. It's going to be more like a, like a catamaran kind of shape. But yeah, sharks, you know, if there's something dangling from it, who knows? And we've spoken with another research institute and they operate some of these USBs and they brought one back off the ocean once and they found a shark tooth embedded from the side of it.

Demetrios [00:40:14]: Oh, wow. Yeah, I imagine not just sharks, but a lot of different creatures are going to want to go up to it, get to know it, poke it a bit, figure out what it is, try and dunk it or take it home with them.

Richard Cloete [00:40:30]: Yeah. And we plan to have live video feeds as well. So eventually, I mean, at the moment, you know, there's starlink out there, which is what's going to be transmitting our data. But in the near future, I think Elon Musk has mentioned that there's going to be a next version of Starlink and it'll have capabilities of 10 times more bandwidth than current systems. And from that point we'll probably be able to offer live video feeds directly from anywhere in the ocean, essentially. And so to make some good entertainment, being able to go to a website and click on play button and get a live feed of the Arctic. Or maybe you get a notification that some shock is attacking your, your, your equipment. You can get a live camera feed in there.

Demetrios [00:41:12]: Oh, or just the school of fish swimming by like you said. If it's able to tell you all of that and it can identify it like you were saying, then that would be incredible to look at. And one for the next Planet Earth series.

Richard Cloete [00:41:25]: Yeah, yeah, lots of data on there.

Demetrios [00:41:29]: You've been getting all kinds of data for a while now. What are some of the most unexpected things you've seen?

Richard Cloete [00:41:40]: Yeah, so the most unexpected one was, okay, picture a camera pointed 30 degrees off the horizon up at the, at the sky. There's a tree coming like that down the left hand side of the, of the frame and an object comes in. It moves with like a constant speed, stops, sits there for a minute, dead still. I can zoom in with the pixels. I can't see any wing flapping, I can't see any rotation of propellers or anything. Sits there and then moves back up, goes to the top, sits there again for like another minute and then just flies straight up and disappears. Of course, we analyzed all the different cameras that we had available to see if we could see that object in other cameras and we couldn't. We looked at the audio and we couldn't hear anything for that time period.

Richard Cloete [00:42:45]: That kind of correlated with what we were seeing. We know that there are red tailed hawks in that area, so it is possible that was just a red tailed hawk. The event occurred at around noon if I'm not mistaken. And that's roughly coincides with the time that these birds like to go out hunting. And they can hover for extended periods of time. So in all likelihood it probably was a red tailed hawk. But I've never seen anything behave like that on a video feed. And because we couldn't see any like, you know, you zoom in on the pixels and you can't see any kind of distortion.

Richard Cloete [00:43:26]: There's nothing moving, it's just static and it's round. So that was what I was going to we thought maybe drone as well, but you know, with drones you have to register where, where you're flying. And the instrumentation that we have there is not far from an airport within five miles, let's say, of an airport. And it's unlikely that there'll be anyone flying any drone around there also because there's nothing out there, just trees.

Demetrios [00:43:57]: Have you seen any other like that that got you? That basically showed, hey, this stopped and then it moved in a way that I'm not used to seeing it moving.

Richard Cloete [00:44:12]: Yeah, there's a few that behave like this, but usually what happens is the objects are. We don't see anything. Like, I haven't seen anything yet, just like zipping off or doing anything crazy, any kind of crazy maneuvers. But we do find things sitting, hovering, and then moving away or sometimes like disappearing. And I'm entirely sure what, yeah, like it gets small and disappears. So we don't know that maybe it's flying away from us or maybe a light goes off. Unfortunately, we don't yet have the triangulation. So until we can get that triangulation, we can't determine the object size and its distance.

Demetrios [00:44:53]: How is the triangulation going to work?

Richard Cloete [00:44:55]: Well, we're going to have to have like two or three other cameras, like, separated by some distance. And then you can basically triangulate and figure out where that object is with respect to those two cameras. It's a parallax effect.

Demetrios [00:45:10]: Man, this is fascinating stuff. And this is so cool to see that you get to work on this every day and get up and think about and look at the data, think about how you're going to capture more data on this. And really I'm excited about Seeker. Once that gets up and online, let me know because it's super cool project. Really think that you're doing a great job with that and I appreciate you coming on here.

Richard Cloete [00:45:41]: Sure. Thank you so much. Yeah, there's, you know, I know there's a lot of interest out there with the UAP and also machine learning and just general kind of technology. So if anyone is out there and interested, you know, feel free to get in touch because we're always looking for extra, extra hands, so.

Demetrios [00:45:58]: Nice.

Richard Cloete [00:45:59]: Yeah. Getting bold.

Demetrios [00:46:00]: And do you see other areas where, besides the Galileo project where you can leverage AI and machine learning to help answer some of these bigger questions?

Richard Cloete [00:46:15]: Yeah, so I think one, one cool idea would be to use the, the actual images coming from telescopes and do some sort of machine learning on those. I don't think much is actually being done in that space. At the moment, the images are typically kind of grainy and black and white. But as telescopes improve, the data fidelity will improve as well. So there might be scope for maybe doing classification on different types of objects in space and also looking for anomalies in that way, or looking to see if they, maybe they, they traverse the sky and then stop. Because that's another thing astronomers are. They're building these algorithms out there to be able to see if something is moving across the sky, but they're not taking consideration that it might stop. If it stops, they'll lose it, or if it takes a right angle turn, they'll lose it.

Richard Cloete [00:47:11]: They've got algorithms designed and built to detect what they know is out there, you know, so more work on kind of anomaly detection algorithms I think is going to be fascinating and something that's sorely needed because anomalies is really what drives progress, right? I mean, you see something unusual and you think, oh, what's that? And you investigate, and it's only then that you actually you learn something. So more work on anomaly detection all around, I think is the way to go both in, in space and on Earth and everywhere else.
