In this case I wanted to see what would happen if the SAS system as a whole was used to take measurements without any "proper" connections or setup - just a pick-up wire.
Neither the pick-up wire nor the RSP and its USB connection were moved between tests. I keyed the TX and took a signal level reading off the screen using the cursor, with (so far as I'm aware) everything in the software at default. The big differences compared to your results - since you have confirmed you are using the calibration tables mentioned by SDRPlay Support - indicate something is probably awry in the measurement setup?
My PC and TX are strapped together to a common ground, but fairly close physically so it's conceivable there's unstable pickup going on with just a bit of wire, particularly at these low signal levels.
I'll re-run my tests using a different dummy load that has an attenuated tap-off and consequent direct cable connection to the RSP. Hopefully that will be better, but I'll have to check the tap output levels are suitable first as it's normally used with a 'scope. I'll also repeat the pickup wire tests using battery power on a laptop, so the RSP is entirely separated from the rig.
If this shows a marked improvement in the variations, then maybe some guidance on setting up to take measurements would be useful?
I'm not prepared to name the other software and hardware I tested on a public forum, particularly given the wide variation between SDRUno and SAS that indicates something else is probably at play here.
Link to forum post re the calibrated analyser - viewtopic.php?f=12&t=3323#p10608 and then the blog referenced within, but my mistake in arithmetic - it's 10.7dB not 12dB difference, my apologies!
alantlk wrote: Hi, sorry for slight delay in reply but work has got in the way a bit.

Likewise here - I know the feeling well.
I've looked at the screenshots John Harper (AE5X) posted on his blog. I see that he has an input signal at around 0dBm. This will almost certainly lead to an overload condition; the Overload warning is shown on the analyser. It's probably best to try and keep the signal no higher than -30dBm. I also see that he has AGC enabled, and has no gain reduction set via the LNA state control. This control works in the same way as it does with SDRuno and reduces the front-end gain. He has the gain reduction set to zero.

I suggest that you turn AGC off and use the AGC and LNA controls to set the gain manually. Setting the number of FFT bins to 32768 or higher will also result in a more stable display in terms of signal amplitude.
The following screenshots were taken with a signal input of 3.5MHz at a level of -30dBm. The overload warning has been triggered; please ignore that, it does not always get cleared once the overload condition is removed - one more for the bug list. I get the same signal at 7MHz as John. It's not produced by my signal generator (tested using a Racal RA6790 receiver), so it looks as if it's produced somewhere in the RSP2. I haven't had time to look into that yet. The third shot was taken as a control image and shows the analyser display with the signal generator output disabled.
The second shot shows the same -30dBm level displayed using SDRuno, note the close correlation in displayed signal levels.
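On the point about FFT size and averaging giving a more stable display: a quick stand-alone sketch (plain Python, nothing to do with the analyser's actual code) of why averaging more spectra steadies the displayed level - the dB jitter of a noise-power measurement shrinks roughly with the square root of the number of averages:

```python
# Illustrative sketch only: why averaging stabilises a displayed dB level.
# Each "sweep" estimates the power of a block of noise; averaging several
# sweeps before converting to dB shrinks the spread of the reading.
import math
import random
import statistics

random.seed(1)

def one_sweep_power(n=256):
    """Mean power of one block of unit-variance Gaussian noise."""
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n)) / n

def level_db(avg_count):
    """Power estimate averaged over avg_count sweeps, in dB."""
    p = sum(one_sweep_power() for _ in range(avg_count)) / avg_count
    return 10.0 * math.log10(p)

single = [level_db(1) for _ in range(200)]    # no averaging
averaged = [level_db(16) for _ in range(200)] # "Average 16", as in the posts

spread_1 = statistics.stdev(single)
spread_16 = statistics.stdev(averaged)
print(f"dB spread, no averaging: {spread_1:.3f}")
print(f"dB spread, 16 averages:  {spread_16:.3f}")
# Expect roughly a factor of sqrt(16) = 4 reduction in dB jitter.
```

More FFT bins act similarly - each bin sees a narrower slice of noise, so the trace jumps around less for a tone of fixed amplitude.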
This screenshot is the one that John Harper posted. I think there may be some confusion regarding the Ref-dBm control in the Display control panel. On a normal analyser, this control would set the appropriate gain or attenuation. Currently the control is just used to set the displayed dBm level at the top of the display and has no effect on gain. This was not made clear in the preliminary instruction manual. The next release will operate in the same manner as a normal analyser with the appropriate gain reductions being set according to the reference level selected. The AGC system will be removed entirely.
As mentioned in my previous post, I'm not sure what the signal at 7MHz is, it may not be the harmonic John was looking for as I get the same signal both with the analyser and SDRuno. Note the similarity in signals between his test, and my test with a signal generator.
I hope this is of some help. Let us know how your further testing goes, and thank you for your comments and observations.
Main PC using the pick-up as before (my dummy load tap output is too high), but moved it and load closer to try and help reduce any extraneous pickup:
SAS - peak -89dBm. Settings - Average 16, NFFT 32768 & Blackman window, IF Gain reduction 40, AGC Off
UNO - peak -83.8dBm. Settings - same as SAS, but see below
At some 5dB apart, that's way better than before.
In SDRuno I found the IF AGC has to be turned off in the main settings, as that seems to be the setting with the major effect in keeping the readings closer. I also used Low IF, as that seemed to get it all closer still. However, these readings were not entirely repeatable and the differential seemed to vary; at times they were 10dB apart.
Change to laptop under battery power, so no common ground connection between source (rig) and laptop other than the USB lead from spares drawer (no ferrite beads on the lead), pickup not moved:
SAS - peak -99dBm
Uno - peak -120dBm
Now some 20dB apart. (I'm not sure about this one, both differences to the other readings seem too large, but I did repeat it a couple of times)
That further indicates the pickup mechanism is probably unstable.
Changed the pickup to my permanent SDR connection on the main rig and computer, where there is some leakage on transmit, but this is an entirely cable connected scenario with common ground. Same basic settings as above, measured the transmit leakage peak:
SAS - -86dBm
Uno - -88.2dBm
Close and similar to your measurements on a signal generator.
Change to the laptop on battery power, spares drawer USB lead (no ferrites), but no change on RF side:
10dB apart again!
Change the laptop spares drawer USB lead to the one from the main PC (ferrites both ends), but no change on RF side:
Now 12dB apart...
How the units are connected seemed to play a big part in the absolute measurements seen, as well as the differentials between SAS and UNO, and as you might expect, the better the ground connection between source and measurement in the cabled scenario, the better the correlation between the two software titles. However, by setting an input offset of 15dB, or by turning on the AGC and setting the LNA to 7 in SAS, I can get both SAS and Uno within 1dB of each other.
I did notice something peculiar though, with both laptop and main PC instances of SAS - on applying the transmit signal, the noise floor of all signals dropped by 20dB - see screenshots. The red is what it was before transmit. On Uno the drop was a bit less noticeable, at about 10dB+. Note these are from yet another pickup arrangement, albeit still using my permanent SDR connection. Hope that helps a bit, but what exactly is going on here, other than that RF grounding and capacitance is involved somewhere, is clear as mud to me!
[Screenshot attachment: main PC on Tx.png - Main PC on TX]
mictor wrote: Hello
Could not open Spectrum Analysis 0.9a
Windows XP closes the program

There is a problem with the analyser relating to location settings. Any location/locale that uses ',' as the decimal separator will cause the application to crash when it attempts to load. This will be rectified in the next release. I suggest you try setting your location to the USA, UK, or any other country that uses '.' as the decimal separator and see if this will let you launch the analyser. Could you let me know what your current location setting is, and whether changing the location setting works for you?
Please note that this is an alpha release, and as such, has a few other problems. Before reporting any problems you may come across, please read the forum posts as you may find that the problem has already been reported, and that a workaround solution has been posted.
alantlk wrote: How the units are connected seemed to play a big part in the absolute measurements seen, as well as the differentials between SAS and UNO, and as you might expect the better the ground connection between source and measurement in the cabled scenario the better the correlation between the two software titles. However, by setting an input offset of 15dB or by turning on the AGC and setting the LNA to 7 in SAS I can get both SAS and Uno within 1dB of each other.

The input offset has no effect on the calculation of absolute dBm levels. The offset control just adds (or subtracts) an offset value to the calculated dBm level prior to displaying it, and is included as an aid to making attenuation or loss measurements - for example, measuring attenuation using a noise generator. Remove the DUT, and use the offset control to set the displayed signal level at 0dBm. Then connect the DUT, and attenuation (or gain) levels can be read directly without having to use the cursors or a calculator.

The dBm trim control works in a similar manner but allows finer control. It is primarily included as a fine level calibration control, and can be used as a correction if, for instance, you know your signal generator output is say 1.7dB below what it should be. Internally, the calculation done is simply: FFT_dBm - SystemGain_dBm + Trim_dBm + InputOffset_dBm. In the next release the Trim_dBm value will be stored as a dBm calibration level. The use of the trim and offset controls is covered in the manual, but perhaps is not as clearly explained as it could be.
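For what it's worth, that internal arithmetic can be sketched as below. This is an illustrative stand-in, not the analyser's actual source; the function and variable names just mirror the terms in the post, and the numbers are made up:

```python
# Hedged sketch of the displayed-level arithmetic:
#   displayed = FFT_dBm - SystemGain_dBm + Trim_dBm + InputOffset_dBm
# Names mirror the forum post, not real source code; values are invented.

def displayed_dbm(fft_dbm, system_gain_dbm, trim_dbm=0.0, input_offset_dbm=0.0):
    """Displayed level per the formula quoted in the post."""
    return fft_dbm - system_gain_dbm + trim_dbm + input_offset_dbm

# Attenuation measurement with a noise source, as described:
# 1) with the DUT removed, pick an offset that puts the display at 0dBm;
reference = displayed_dbm(fft_dbm=-52.0, system_gain_dbm=-32.0)  # reads -20dBm
offset = -reference                                              # +20dB offset
assert displayed_dbm(-52.0, -32.0, input_offset_dbm=offset) == 0.0

# 2) insert a DUT that loses 6dB; the display now reads the loss directly,
#    no cursors or calculator needed.
with_dut = displayed_dbm(-58.0, -32.0, input_offset_dbm=offset)
print(with_dut)  # -6.0, i.e. 6dB of attenuation
```

The trim control would simply be a second additive term folded in before display, which is why it behaves as a fine calibration adjustment.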
Could you let me know which RSP model you are using for your tests ? - I'll see if I can try and duplicate your results.
The lack of consistency between your measurements is a puzzle. As regards the overall level dropping when you key your transmitter, I would say it is essential that you do not have AGC enabled. Also, using Low IF with Uno may give different results to the analyser due to the difference in detected bandwidth. The analyser currently only operates in zero-IF mode; this has been changed for the next release.
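On that detected-bandwidth point: noise power grows with bandwidth, so two detectors with different bandwidths will report different noise floors even from the same hardware. A minimal sketch of the scaling (the bandwidth figures are invented examples, not RSP mode specifications):

```python
# Illustrative only: how detected bandwidth shifts a measured noise floor.
# Noise power is proportional to bandwidth, so the floor moves by
# 10*log10(bw_wide / bw_narrow) dB between two detection bandwidths.
import math

def noise_floor_shift_db(bw_wide_hz, bw_narrow_hz):
    """dB difference in noise floor between two detection bandwidths."""
    return 10.0 * math.log10(bw_wide_hz / bw_narrow_hz)

# e.g. comparing a hypothetical 2MHz detected bandwidth against 600kHz:
print(round(noise_floor_shift_db(2_000_000, 600_000), 2))  # about 5.23 dB
```

So a few dB of disagreement between two programs can come purely from bandwidth, before any gain or calibration differences enter into it.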
I hope the above makes sense. I was up to around 4AM local time watching the World Cup - sleep deprivation is not conducive to clear thinking.
My SDR connection uses Port B of my RSP2Pro, so it is presumably measuring between the signal input and its ground connection, and if there's a parasitic capacitance or imbalance it will see something different to what is anticipated?
With a well ground-bonded circuit, like a signal generator or my permanent SDR feed, those parasitics or imbalances either don't exist or are swamped by the ground bonding.
The changes I saw with the ground-floating laptop, particularly with and without the ferrite beads on the USB leads, demonstrate that quite well, and of course a single pick-up wire is also completely unbalanced. Using 5W, my rig's minimum, will also put more RF into the system, so any unstable voltage nodes will be larger. I guess RSP ground probably wasn't anything of the sort at -80dBm!
In short the RSP2Pro "ground" connection was probably "RF hot" in some of my measurement circumstances, albeit at microvolt levels.
To try and prove that to myself, today I made a small 4-turn pickup coil and connected it to the balanced P and N connectors on the RSP HiZ port, with a nearby dummy load fitted onto the coax from the transmitter, so eliminating "ground" from the measurements. Uno and SAS noise floor readings were different (SAS gain was at 40, Uno IF gain manual at 40), so I roughly balanced those to the same level using the Uno gain (attenuation) control, with all AGC settings off on both. On transmit, the measured peak on each was then within 5dB of the other: Uno at -50dBm and SAS at -55dBm. I suspect I could have got closer by finer balancing of the noise floor gain.
Alternatively one could maybe create a virtual earth at the RSP ground with a tuned counterpoise?
However, while the above may explain the differences in levels between measurement setups, I don't understand why SAS and Uno measure differently in each setup. I assume those measurements are done a different way in each piece of software?
Since SAS is perhaps more likely to be used for measurements, I hope this exchange has been useful in showing it might not be an entirely straightforward process. Some guidance on setting up for those measurements looks to be needed?
As for changes in noise floor on transmit, it occurred to me today that the SDR interface box might be shorting or adding a resistor across the line to the SDR, so that probably explains the reductions in overall signal level seen yesterday on keying the TX - sorry I didn't realise that then, but it was getting late... However, with the balanced pickup coil noted above, both SAS and Uno showed a 10dB *rise* in that noise floor level when the TX was keyed. Why that might be, I don't have the faintest idea!
- I felt like it (v0.9) is missing the regular Zone, Peak Search, and Normal/Delta marker operations, even if with a fixed zone width of, say, 0.5 division. These are probably the features I use most when I'm using an analyser. I find the current cursor functions less intuitive. Also, the info in the Cursors dialogue is repeated on the display, which leads to the next comment...
- The centre frequency, sweep width and cursor info take up quite a bit of screen real estate, so I'm wondering if that info could be simplified, possibly using colours for the cursor/marker position and level, and moving centre frequency and sweep width (span?) to the bottom of the screen?
I'm especially interested in having the ability to open out the span to 1GHz (by processing 5MHz blocks, as you mentioned in an earlier post), even if it's a bit slow, so I'm hoping this is high up on your 'planned feature list' :=)
Do you have any timescales for future releases/feature inclusions, or is it very much a 'watch this space' project ?
Your labours are appreciated.
PeteW wrote: I felt like it (v0.9) is missing the regular Zone, Peak Search, and Normal/Delta marker operations, even if with a fixed zone width of, say, 0.5 division.

The reason for releasing the analyser as an alpha release was to get feedback such as this, and your comments are much appreciated.
I take your point about the data display taking up display real estate, and that is one of the features that is being addressed. The analyser is currently undergoing a fairly wide-ranging re-write, which includes a lot of changes and additions to the cursor system. Wide-band sweeps up to 2GHz are also being looked into. The current version was able to carry out wide-band sweeps but this facility was disabled due to various problems, the time taken to do such a wide sweep being one of them. There is not much scope for increasing the scanning speed for wide sweeps, but a progressive display update is being considered which should improve matters.
As to a timeline for the next release, I'm afraid it really is a case of watch this space. I can only say that if necessary, I'll release another alpha version to get further feedback before committing to a final beta version.