MoonRF mostly takes care of the hardware and pointing, and then the fun is playing with software and signal processing: https://github.com/open-space-sdr/main
It almost looks as if the EME bounce capability of this antenna is a fig leaf or an afterthought; my own 'applications' list would include a lot of things, but not that.
PS: you don't really need this. A phased array is great for communicating with or tracking fast-moving objects. For something as slow as the moon, a simple parabolic dish, either manually aimed or on an az/el motor, will be more cost-effective. Motorized mounts get expensive too once you factor in wind, rain, and longevity (a mount moving 24/7 wears out), but hams don't moonbounce constantly.
Starlink sats move really quickly through the night sky, and the dish tracks multiple satellites so you don't have interruptions; that's why a phased array is great for that purpose. For incidental ham use to the moon, it's very interesting tech but not exactly necessary.
And of course it's open source: https://github.com/open-space-sdr/main/
Yeah it's impressive and I know hams often spend a lot of money on gear. I don't though (I don't even do HF) but it's certainly cool to see.
But for incidental moon tracking I don't really see the need for a phased array other than the cool factor and the knowledge gained building it. Which are perfectly good reasons to do it of course! Just not technical ones.
Their standalone 4-antenna tiles (https://moonrf.com/updates/) show off some killer apps, like 30 fps spatial RF visualization and NEON-optimized drone video interception.
I'm rolling my eyes at the "Agentic Transceiver" part, though. It is highly doubtful that an onboard AI casually writes, debugs, and compiles a real-time C app with analog video color sync recovery and decode in ten minutes.
Using video interfaces to transfer arbitrary data at high speeds is becoming a common trick for cheap boards with limited interfaces. Video inputs and outputs are highly mature and optimized to avoid dropping frames, because everyone wants reliable video, so pushing arbitrary data through a video IO pipeline is a cheap way to get high-speed IO over standard interfaces.
There is a cool project that uses cheap HDMI to USB capture devices for high speed data transfer out of cheap FPGA boards that have HDMI output [ https://github.com/steve-m/hsdaoh ]
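The core trick can be sketched in a few lines. This is a toy illustration only: hsdaoh's real framing adds per-line sync words, sequence counters, and CRCs, and the frame geometry here is an arbitrary choice.

```python
import numpy as np

# Toy sketch of smuggling arbitrary bytes through a video pipeline:
# treat each 8-bit grayscale frame as a flat byte buffer. A real
# design (like hsdaoh) also adds sync words, sequence counters, and
# CRCs so dropped or reordered frames are detectable.
WIDTH, HEIGHT = 1280, 720

def pack_into_frame(payload: bytes) -> np.ndarray:
    buf = np.zeros(WIDTH * HEIGHT, dtype=np.uint8)
    data = np.frombuffer(payload, dtype=np.uint8)
    buf[:data.size] = data
    return buf.reshape(HEIGHT, WIDTH)

def unpack_frame(frame: np.ndarray, nbytes: int) -> bytes:
    return frame.reshape(-1)[:nbytes].tobytes()

msg = b"arbitrary data riding a video link"
assert unpack_frame(pack_into_frame(msg), len(msg)) == msg

# Even this modest mode carries real bandwidth:
# 1280 * 720 * 8 bits * 60 fps = ~442 Mbit/s of payload.
```

The appeal is exactly what the comment above describes: the video path is already engineered end to end to deliver every frame, so your data inherits that reliability for free.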
In a perfect world, using PCIe directly would be a much better solution for a project like this. Having access to PCIe DMA support directly without relying on video IO peripherals is helpful for high speed ADC/DAC applications like this. It would also make the board more portable to other SBCs.
The ECP5-5G can do PCIe 2.0 x2 or PCIe 1.0 x4 which would provide around 8Gbps of data transfer. The problem is that the Raspberry Pi 5 only exposes a single PCIe lane to the user. The other 4 PCIe lanes of the Raspberry Pi 5 SoC are routed to the RP1 chip, which has the MIPI and CSI interfaces that are used in this project. So the data is going through a convoluted path instead of being connected to PCIe directly.
I would have to look at the details more closely, but even using the PCIe 2.0 x1 port (around 4 Gbps after overhead) on the Raspberry Pi would be close in bandwidth to the 5.6 Gbps number they give for their custom MIPI solution.
I think the Raspberry Pi 5 is a good first choice for most projects because it is widely supported and has the largest community, but for a project like this, moving to a different SBC that exposes PCIe 2.0 x2 would have been worthwhile. Keeping the project semi-independent of the SBC has a lot of benefits.
There is a line in the book Accelerando about how evolution did this with biological vision.
It's basically the highest-bandwidth sense we have, and it evolved AFTER smell (chemical-based) and hearing (gas-pressure-based).
If you use PCIe, theoretically you don't need to reverse engineer how they implemented it, because you're not at the edge of the spec like they are here.
That said, I've thought about doing what they're doing countless times and it is nice to see it would work.
In the multi-tile array it apparently still only needs one Pi [1] as the FPGAs do the heavy lifting.
It says 1W TX power per antenna. So the 240 antenna array which draws 1500W has a transmit power of 240W.
I am sort of skeptical of the claimed gain... even at 6 GHz, you need a 2-meter parabolic reflector to get 40 dB, and the array is 1/10th that diameter. EDIT: Ignore this second paragraph, I misread the spec page.
And yeah, the agentic stuff is dumb, I've played a ton with doing low level SDR work on Opus 4.6 and it's truly ass.
Also, the "can't radar, plz don't ITAR" is horseshit. Some basic fw tweaks and you could get this to be, at the very least, a sweet FMCW setup.
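To be fair to that point, the signal-processing core of FMCW really is simple. This is a generic textbook dechirp sketch, nothing specific to this hardware; the chirp bandwidth, sample rate, and target range are made-up illustrative values, and the simulation is idealized (no noise, echo present for the full sweep).

```python
import numpy as np

# Toy FMCW dechirp: simulate one chirp plus a delayed echo, then
# recover the beat frequency, which is proportional to target range.
c = 3e8            # speed of light, m/s
fs = 10e6          # baseband sample rate, Hz
T = 1e-3           # chirp duration, s
B = 5e6            # swept bandwidth, Hz
R = 1500.0         # simulated target range, m

t = np.arange(int(fs * T)) / fs
k = B / T                              # chirp slope, Hz/s
tx = np.exp(2j * np.pi * (0.5 * k * t**2))
tau = 2 * R / c                        # round-trip delay
rx = np.exp(2j * np.pi * (0.5 * k * (t - tau)**2))

beat = tx * np.conj(rx)                # dechirp: a tone at k * tau
spec = np.abs(np.fft.fft(beat))
f_beat = abs(np.fft.fftfreq(t.size, 1 / fs)[np.argmax(spec)])
R_est = f_beat * c / (2 * k)           # ~1500 m
```

Mix the received echo against the transmitted chirp, take an FFT, read range off the peak; the hard parts of a real radar are in the RF front end and calibration, not in these ten lines.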
My assumption is that they're trying to avoid crossing a legal line, as opposed to being personally invested in the idea of preventing radar use by a determined hobbyist.
That's a lot of juice for 12VDC
> That's a lot of juice for 12VDC
Indeed it is. That's 125 amps, which, apart from car starter motors, is essentially unheard of because of wiring losses. I think the article somehow got this wrong.
At these power levels, rational designs raise the source voltage, then down-convert closer to the loads.
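The arithmetic makes the point; the 10-milliohm round-trip cable resistance here is an assumed round number, not a measured figure.

```python
# I^2*R loss in the supply wiring for a fixed 1.5 kW load.
# The 10 mOhm round-trip cable resistance is an assumed value.
def wiring_loss_w(power_w: float, volts: float, r_ohms: float = 0.010) -> float:
    amps = power_w / volts
    return amps**2 * r_ohms

print(wiring_loss_w(1500, 12))  # 156.25 W wasted at 12 V (125 A)
print(wiring_loss_w(1500, 48))  # ~9.77 W at 48 V through the same cable
```

Quadrupling the voltage cuts the current to a quarter and the wiring loss to a sixteenth, which is exactly why high-power designs distribute at 48 V or more and convert down at the load.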
What I find more interesting than the license question is the software side. They mention a pre-loaded SD card with SDR applications, which probably means GNU Radio or something built on top of it. If they release the beamforming DSP pipeline as open source, that is genuinely valuable -- most phased array signal processing code is locked behind defense contractor NDAs. Having a reference implementation that people can study and modify on commodity hardware at the 399 dollar price point would be a significant contribution to the SDR community regardless of when the repo goes live.
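For a sense of what sits at the core of such a pipeline: a narrowband phase-shift (delay-and-sum) beamformer for a uniform linear array is only a few lines. This is the textbook version, not anything from MoonRF.

```python
import numpy as np

# Narrowband phase-shift (delay-and-sum) beamformer for a uniform
# linear array: apply a per-element phase ramp to steer toward
# theta, then sum across elements.
def steer(samples: np.ndarray, d_over_lambda: float, theta_deg: float) -> np.ndarray:
    """samples: (n_antennas, n_samples) complex baseband."""
    n = samples.shape[0]
    theta = np.deg2rad(theta_deg)
    # Phase progression across elements for a plane wave from theta.
    phase = 2 * np.pi * d_over_lambda * np.arange(n) * np.sin(theta)
    weights = np.exp(-1j * phase)
    return weights @ samples / n

# Plane wave arriving from 20 degrees on an 8-element half-wave array:
n, theta = 8, 20.0
phase = 2 * np.pi * 0.5 * np.arange(n) * np.sin(np.deg2rad(theta))
x = np.exp(1j * phase)[:, None] * np.ones((1, 64))

assert abs(steer(x, 0.5, 20.0)[0]) > 0.99   # steered on target: full gain
assert abs(steer(x, 0.5, -40.0)[0]) < 0.3   # steered away: attenuated
```

The hard engineering in a real array is everything around this line: per-channel phase/gain calibration, mutual coupling, and wideband true-time-delay steering. An open reference implementation of those pieces would be the genuinely rare part.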
It's also why pictures of modern naval vessels show flat panels instead of rotating parabolic antennas as in past decades. The panels contain advanced phased-array radars.
Which is why I ask. I'm not a lawyer, but there could be a general dual use ban, but with some other regulation that exempts e.g. UK.
Don’t use this in Iran.
When the page says "uh… do not use this to build a phased array radar… even though you could. And if you do, then in no way were we involved. Just don't", this is extremely likely to be about ITAR.
This implies it's about operating a radio transmitter.
Iran will absolutely frown on that right now, as they've frowned on Starlink. Their internet shutoff indicates "empowering the public to connect across the world" is not really what they want.
[1] https://www.gesetze-im-internet.de/afuv_2005/anlage_1.html
[2] https://www.ecfr.gov/current/title-47/chapter-I/subchapter-D...
This is a clue from their webpage: "Not intended for radar applications. Core functionality needed for radar not included due to export control restrictions."
Or does it still need industrial-grade lasers?
Not only is the moon >100x farther away (even accounting for near-horizon satellite angles), but you're also trying to bounce a signal off it as a passive reflector, which is harder than just transmitting something an active lunar receiver could detect and re-transmit back.
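A rough scaling of just the distance effect, leaving out the moon's radar cross-section and reflectivity entirely; the 2000 km near-horizon LEO slant range is an assumed figure.

```python
import math

# One-way links lose power as 1/R^2; a passive bounce follows the
# radar equation's ~1/R^4. Compare just the distance scaling.
R_sat = 2_000e3      # assumed near-horizon LEO slant range, m
R_moon = 384_400e3   # mean Earth-Moon distance, m

ratio = R_moon / R_sat                       # ~192x farther
one_way_penalty_db = 20 * math.log10(ratio)  # ~46 dB if the moon were an active relay
passive_penalty_db = 40 * math.log10(ratio)  # ~91 dB under passive-reflection scaling
print(one_way_penalty_db, passive_penalty_db)
```

So even before accounting for the moon being a poor, rough reflector, the distance alone costs tens of dB over a LEO link, and the passive bounce doubles that exponent.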
Meanwhile... the RPi alone will probably make up 299 dollars of that price tag [1].
It is not a good time to design hardware that needs RAM. Arrest and imprison Sam Altman.
[1] https://www.jeffgeerling.com/blog/2026/dram-pricing-is-killi...