[Ground-station] Baseband => decimation - questions

Scotty Cowling scotty at tonks.com
Sat Feb 2 13:11:14 PST 2019


This is a great discussion, sorry I have arrived so late in the thread.

Since I am an FPGA and hardware guy, my comments are hardware-centric. 
Probably makes me odd man out, but here goes...

I greatly favor a single ADC at a higher rate (than dual ADCs clocked in 
phase or quadrature). The I and Q samples can be generated 
mathematically inside the FPGA without the analog mismatch problems 
associated with trying to build two identical copies of analog and ADC 
hardware. The quadrature mixers are implemented in digital hardware to 
whatever precision the hardware allows. Of course, there is a price to 
be paid. The ADC has to run twice as fast, and wider NCOs and 
multipliers use FPGA fabric and require faster logic. Both cost $$.
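To make the digital quadrature mixer idea concrete, here is a throwaway NumPy sketch (not FPGA fabric, obviously) using a hypothetical 125 Msps clock and a 30 MHz test tone, the figures I toss around below. One NCO drives two perfectly matched multiplies, so there is no analog I/Q mismatch to worry about:

```python
import numpy as np

# Assumed figures (hypothetical, borrowed from the numbers later in this post):
fs = 125e6           # single ADC sample rate
f_if = 30e6          # IF of a unit-amplitude test tone
n = np.arange(4096)

# Real-valued samples from the single ADC: one tone at the IF
adc = np.cos(2 * np.pi * f_if / fs * n)

# Digital quadrature mixer: one NCO feeding two perfectly matched
# multipliers (in an FPGA: phase accumulator + sine/cosine lookup table)
nco = np.exp(-2j * np.pi * f_if / fs * n)
iq = adc * nco       # complex baseband: I = iq.real, Q = iq.imag

# The tone now sits at DC, plus an image at -2*f_if that a decimating
# low-pass filter would remove; a long average stands in for that filter
baseband = np.mean(iq)   # magnitude ~0.5 for a unit-amplitude input tone
```

In real hardware the mean() would be a CIC or FIR decimator, and the NCO precision is whatever the phase accumulator and lookup table widths allow.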

While the AD9361 is a fantastic part, we pay for all the features 
whether we use them or not. Mixers and oscillators all add noise, 
whether they are all integrated inside one chip or spread out. System 
engineering is in order.

As Zach said, every decimation by 4 increases dynamic range by 
approximately 1 bit. So we should sample as fast as possible, and 
decimate in the digital domain. Except that ADC width and speed cost 
money, and the FPGA's ability to consume and process data is limited in 
speed. So we want to pick ADC width, clock speed and FPGA based on the 
knee of the price-performance curve. This "knee" is always moving, which 
is why today's hardware is better than yesterday's (well, usually).
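As a quick numerical sanity check of the decimate-by-4 rule of thumb, here is a sketch that models one decimation stage as a boxcar average over white noise (a crude stand-in for a single CIC stage):

```python
import numpy as np

rng = np.random.default_rng(0)

# White noise standing in for the wideband noise floor at the ADC output
noise = rng.normal(scale=1.0, size=1 << 20)

# Decimate by 4 with a boxcar average (the simplest one-stage CIC):
# each output sample is the mean of 4 consecutive input samples
decimated = noise.reshape(-1, 4).mean(axis=1)

# Uncorrelated noise power drops by the decimation factor:
# 10*log10(4) ~= 6 dB, i.e. about one extra bit of dynamic range
gain_db = 20 * np.log10(noise.std() / decimated.std())
```

The ~6 dB comes out only because the noise is uncorrelated sample to sample; correlated spurs and images do not average away, which is why the filter in front of the decimator matters.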

My guess is that the practical, affordable ADC performance today is at 
~125 Msps and either 14 or 16 bits. Interestingly, affordable FPGAs can 
handle 150 MHz to 200 MHz at their inputs and outputs without too much 
problem. Go much faster, and you get really expensive really fast. The 
same with resources. I am most familiar with Altera/Intel FPGAs, so 
someone with Xilinx experience can add to this. The best cost per logic 
element seems to be either the MAX10M50 (50K LEs) or the Cyclone 5 E 
(logic only series). The C5E goes up to 300K LEs (which is a lot, and 
costs appropriately). If we want a dual-core ARM hard processor 
(pre-configured in silicon, using zero LEs), the Cyclone 5 SE at 110K 
LEs is probably the best bet.

So where does this leave us for P4G hardware? Some kind of down 
conversion at the antenna is pretty obvious. Is it better to use an 
off-the-shelf LNB to a higher frequency (e.g., 900 MHz) and then 
down-convert to baseband at the SDR receiver front end? Or is it better 
to use a custom LNB and down-convert directly to a baseband frequency 
(say 30 MHz) that can be directly sampled?

A third option would be to build a VHF or UHF direct-sampling SDR (at 
900 MHz, to use the above example), but I think that would be too expensive.
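For what it's worth, a back-of-the-envelope Nyquist-zone check (helper names are mine, and the ~125 Msps figure is the assumption from above) shows where those IF choices land: 30 MHz sits in the first Nyquist zone and samples directly, while 900 MHz would be zone-15 undersampling and would demand ~1 GHz of analog front-end bandwidth:

```python
# Hypothetical helpers; assumes the ~125 Msps ADC figure discussed above.

def nyquist_zone(f_signal, fs):
    """1-based Nyquist zone containing f_signal; zone edges at k*fs/2."""
    return int(f_signal // (fs / 2)) + 1

def alias_freq(f_signal, fs):
    """Apparent frequency of f_signal after sampling at rate fs."""
    f = f_signal % fs
    return f if f <= fs / 2 else fs - f

fs = 125e6
# 30 MHz IF:  zone 1 -> plain first-Nyquist direct sampling
# 900 MHz IF: zone 15 -> undersampling; the tone aliases down to 25 MHz,
#             but the ADC track/hold must still handle 900 MHz
```

That front-end bandwidth requirement, not the sample rate itself, is what makes the direct-sampling option expensive.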

73,
Scotty WA2DFI
