VGAtoIQBaseband
VGAtoIQBaseband | Release status: stable |
---|---|
Description | use the VGA port to generate I/Q baseband signals |
Author(s) | siro (siro) |
Last Version | 0.1 () |
Platform | Linux |
License | mixed |
Download | [1] |
Description
This application reads I/Q data from stdin and generates analog I/Q signals on the VGA connector that can be fed into an RF modulator. This software is intended to be used with SDRT or SDRT2.
Requirements
- PC with VGA adapter
- OpenGL with Direct Rendering
- Vsync enabled
- Software-defined Radio Transmitter that generates I/Q Signals
- GL_ARB_fragment_program to use the low-pass filter
- GL_ARB_pixel_buffer_object for efficient CPU-to-GPU transfers (a check for both extensions is sketched after this list)
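The following is only an illustrative sketch, not part of vgatoiqbaseband, showing how the two extensions above can be checked before starting, assuming freeglut as listed under Libraries:

    #include <GL/glut.h>
    #include <cstdio>

    int main(int argc, char **argv) {
        // A GL context is needed before extensions can be queried.
        glutInit(&argc, argv);
        glutCreateWindow("extension check");

        const char *needed[] = { "GL_ARB_fragment_program",
                                 "GL_ARB_pixel_buffer_object" };
        for (const char *ext : needed)
            std::printf("%s: %s\n", ext,
                        glutExtensionSupported(ext) ? "available" : "missing");
        return 0;
    }

Build with g++ and link against the libraries listed below (-lglut -lGLU -lGL).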
What it does
Linux
tested with:
- Mesa 9.0.2
- fglrx 8.97.2
Windows
NOT SUPPORTED
- it is possible to add custom EDIDs / modelines, but that is rather a hack using modified .inf files
- other ways of adding modelines are unknown
Software
The current version runs at any symbol rate; 5 MSymbols/s or more is recommended.
Set the DAC clock to 7 × the desired symbol rate in MSymbols/s, because low pixel clocks might not work, while high pixel clocks can be scaled down by using multiple pixels per symbol (here 7). For example, 8 MSymbols/s requires a 56 MHz DAC clock.
Every symbol is a complex number containing an I (real) and a Q (imaginary) value.
By default the signal is low-pass filtered by convolution with a sinc function. To change the impulse response, edit the file fragment.glsl.
The I/Q values are fed in through a Unix FIFO or a file. Every 8 bytes form two 32-bit floats; the first float maps to I, the second to Q.
The values have to be between 0.0f and 1.0f.
The real part (I) is assigned to the RED channel and the imaginary part (Q) to the GREEN channel, while the BLUE channel is set to 0.5f.
Graphics cards with 8-bit and 10-bit VGA DACs are supported.
vgatoiqbaseband uses the first VGA output it finds. If none is found, it terminates.
The VGA output has to be configured first using xrandr, arandr, or a similar tool.
The coordinates are saved, then a new modeline is added and set. On termination the modeline is removed and the initial mode is restored.
There is no need to do any modesetting by hand.
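For illustration only, and not the tool's actual implementation: a minimal sketch, assuming libXrandr and libX11 (both listed under Libraries), of how the first connected VGA output could be located:

    #include <cstdio>
    #include <cstring>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main() {
        Display *dpy = XOpenDisplay(nullptr);
        if (!dpy) { std::fprintf(stderr, "cannot open display\n"); return 1; }

        Window root = DefaultRootWindow(dpy);
        XRRScreenResources *res = XRRGetScreenResources(dpy, root);

        for (int i = 0; i < res->noutput; ++i) {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
            // Take the first connected output whose name starts with "VGA".
            if (out->connection == RR_Connected && std::strncmp(out->name, "VGA", 3) == 0)
                std::printf("found VGA output: %s\n", out->name);
            XRRFreeOutputInfo(out);
        }

        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }

Adding and switching the modeline itself also goes through the RandR extension; this sketch only covers finding the output.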
All baseband signals contain positive as well as negative values. If your signal range covers -1.0f to 1.0f, you have to convert it first:
newval = (float2)(oldval.i / 2 + 0.5f, oldval.q / 2 + 0.5f)
newval is now in the range 0.0f to 1.0f and can be used with this program.
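As a minimal sketch (not part of the tool), this is how a transmitter could apply that conversion and stream interleaved 32-bit float I/Q pairs into a FIFO. The path /tmp/myfifo matches the example further below; the tone source is purely illustrative:

    #include <cstdio>
    #include <cmath>
    #include <vector>

    int main() {
        // Open the FIFO that vgatoiqbaseband reads from (create it with "mkfifo /tmp/myfifo" first).
        FILE *fifo = std::fopen("/tmp/myfifo", "wb");
        if (!fifo) { std::perror("fopen"); return 1; }

        const double fs = 8.0e6;     // symbol rate: 8 MSymbols/s
        const double f  = 100.0e3;   // illustrative tone frequency in Hz
        std::vector<float> buf;

        for (long n = 0; ; ++n) {
            double phase = 2.0 * M_PI * f * n / fs;
            float i = (float)std::cos(phase);   // real part, -1.0f .. 1.0f
            float q = (float)std::sin(phase);   // imaginary part, -1.0f .. 1.0f

            // Map -1..1 to the 0.0f .. 1.0f range expected on the FIFO.
            buf.push_back(i / 2.0f + 0.5f);     // first float: I
            buf.push_back(q / 2.0f + 0.5f);     // second float: Q

            if (buf.size() >= 2 * 4096) {       // write in blocks of 4096 symbols
                std::fwrite(buf.data(), sizeof(float), buf.size(), fifo);
                buf.clear();
            }
        }
    }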
Because most GPUs only have 8-bit DACs, the floats are converted into a range from -127 to 127 (using the blue channel as the differential reference).
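A small sketch of the arithmetic this implies; the exact mapping inside the tool may differ, this only illustrates the differential idea with I on RED and BLUE fixed at 0.5f:

    #include <cstdio>
    #include <cmath>

    int main() {
        float i = 0.75f;                             // normalized input, 0.0f .. 1.0f
        int red  = (int)std::lround(i * 255.0f);     // I on the RED channel
        int blue = (int)std::lround(0.5f * 255.0f);  // fixed mid-level reference
        int diff = red - blue;                       // differential value, roughly -127 .. 127
        std::printf("red=%d blue=%d differential=%d\n", red, blue, diff);
        return 0;
    }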
Arguments
./vgatoiqbaseband [ARGS]
reads data from stdin
ARGS could be:
-nofilter         disable the fragment shader
-v                be verbose
-verbose          be verbose
-display DISPLAY  specify the X server to connect to; if not specified, the value of the DISPLAY environment variable is used
-direct           force direct rendering
-t                generate test patterns
-d <x>            convolutional depth, default: 17
-f <c>            same as --freq
--freq <c>        set the convolutional filter cut-off frequency to c MHz (c is a float), default: 3.81 MHz
-h                same as --help
--help            print help message
Only if compiled with libXrandr support:
-cutofright <n>   cut off the n rightmost pixels (hsync), default: 1
-cutofbottom <m>  cut off the m bottommost pixels (vsync), default: 2
-pclk <x>         use a pixel clock of x MHz, default: 64
Examples
Set the VGA position first:
xrandr --output VGA-0 --right-of LVDS --auto
Adjust VGA-0 and LVDS according to your system.
./vgatoiqbaseband /tmp/myfifo -v -msps 56 -cutofright 0 -cutofbottom 0
This will read from the file /tmp/myfifo; with a 56 MHz pixel clock it outputs 56/7 = 8 MSymbols/s, and both HSYNC and VSYNC are set to 0.
A VSYNC of 0 might only work on Intel GPUs.
./vgatoiqbaseband -v -t
This will generate test patterns; with the default 64 MHz pixel clock it outputs 64/7 = 9.142 MSymbols/s, HSYNC is set to 1 (7 pixels) and VSYNC to 3 lines.
These settings might work on all GPUs.
This also removes the rightmost data value in each line and cuts off the 3 bottommost lines from the input data to prevent phase shifts.
Libraries
- libglut
- libglu
- libgl
You may also use
- libxrandr
- libX11
Debian packages:
- freeglut3-dev
- libglu1-mesa-dev
- libgl1-mesa-dev
- libxrandr-dev
OFDM Bandwidth
OFDM useful carriers to total carriers ratio: 1705 / 2048 = 0.832
DAC clock [MHz] | MSymbols/s | carrier ratio | 3 dB bandwidth [MHz] |
---|---|---|---|
64 | 9.14285 | 0.832 | 7.61 |
56 | 8.000 | 0.832 | 6.66 |
48 | 6.8571 | 0.832 | 5.71 |
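The table follows from the relations above: symbol rate = DAC clock / 7, and 3 dB bandwidth = symbol rate × carrier ratio. A small illustrative calculation:

    #include <cstdio>

    int main() {
        const double carrier_ratio = 1705.0 / 2048.0;   // OFDM useful / total carriers
        const double dac_clock_mhz[] = { 64.0, 56.0, 48.0 };

        for (double clk : dac_clock_mhz) {
            double msymbols = clk / 7.0;                // 7 pixels per symbol
            double bw_mhz   = msymbols * carrier_ratio; // 3 dB bandwidth
            std::printf("%4.0f MHz DAC clock -> %7.4f MSymbols/s -> %4.2f MHz bandwidth\n",
                        clk, msymbols, bw_mhz);
        }
        return 0;
    }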
Phase shift
Because phase-shifting OFDM symbols results in a malformed spectrum, the horizontal sync is padded with blanking pixels to make sure the sync has the same length as every other symbol (with 7 pixels per symbol, an HSYNC of 1 corresponds to 7 pixels).
Interpolation Filter
The convolutional filter is critical in this application. With an OFDM bandwidth of 8 MHz, the maximum frequency on the I and Q channels is 3.81 MHz. The DAC sampling rate of 9.142 MSPS is enough to reconstruct all frequencies, but because the sampling rate is low, the baseband signal contains aliasing artifacts.
To remove those artifacts a sinc filter is necessary, acting together with an interpolating DAC that runs at a much higher sampling rate.
This is done by the fragment shader, implemented in shader.cpp.
The GPU has to fetch, multiply, add and store at least <sampling rate> * <conv_depth> pixels per second.
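For reference, a hedged sketch of how a windowed-sinc impulse response with the default 3.81 MHz cut-off and a convolutional depth of 17 taps could be computed; the actual kernel lives in fragment.glsl and may be windowed or normalized differently:

    #include <cstdio>
    #include <cmath>
    #include <vector>

    // Windowed-sinc low-pass taps. Depth and cut-off are the defaults documented
    // above; the Hamming window is an assumption for this sketch.
    std::vector<double> sinc_taps(int depth, double cutoff_hz, double sample_rate_hz) {
        std::vector<double> taps(depth);
        double fc = cutoff_hz / sample_rate_hz;             // normalized cut-off
        double sum = 0.0;
        for (int n = 0; n < depth; ++n) {
            double m = n - (depth - 1) / 2.0;               // center the sinc
            double sinc = (m == 0.0) ? 2.0 * fc
                                     : std::sin(2.0 * M_PI * fc * m) / (M_PI * m);
            double window = 0.54 - 0.46 * std::cos(2.0 * M_PI * n / (depth - 1));
            taps[n] = sinc * window;
            sum += taps[n];
        }
        for (double &t : taps) t /= sum;                    // unity gain at DC
        return taps;
    }

    int main() {
        // Defaults from this page: 17 taps, 3.81 MHz cut-off, 64 MHz pixel clock.
        for (double t : sinc_taps(17, 3.81e6, 64.0e6))
            std::printf("%f\n", t);
        return 0;
    }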
DAC clock [MSps] | GPU | Driver | max Kernel_size |
---|---|---|---|
64 | RV710 | Mesa 9.0.2 | 7 |
64 | RV710 | fglrx 8.97.2 | 17 |
recommended | | | 27 or more |
A hardware low-pass filter is required to remove remaining aliasing effects.
Screen size
On fglrx the default screen size is 1600x1600. To increase this limit, generate an xorg.conf using 'aticonfig --initial' and adjust the Screen section as shown below:
[...]
Section "Screen"
    Identifier   "aticonfig-Screen[0]-0"
    Device       "aticonfig-Device[0]-0"
    Monitor      "aticonfig-Monitor[0]-0"
    DefaultDepth 24
    SubSection "Display"
        Viewport 0 0
        Virtual  3600 1600
        Depth    24
    EndSubSection
EndSection
The line containing "Virtual" is critical. This should be no problem on Mesa as the default screen size is 8196x8196.
TODO
- Windows modesetting support
- You may use a custom EDID; see HackingVGAforFun for more details.