Hi @Karine,
You've raised a really interesting point here, and I think your observation about the missing sqrt(12/128) factor deserves a careful look. I did some digging through the MathWorks documentation and other LTE examples to see if this is a genuine issue or if there's something subtle going on with how SNR is defined.
First off, your reasoning about the power normalization makes sense from an OFDM theory perspective. When you add noise in the time domain, that noise gets spread across all FFT bins after demodulation, but your signal only occupies the used subcarriers. For NB-IoT with 12 active subcarriers out of 128 total FFT bins, you'd expect a scaling factor related to this ratio. The sqrt(12/128) amplitude factor you mentioned would account for it, and in power terms that's 10*log10(128/12), about 10.3 dB, which is definitely large enough to notice when you measure the actual SNR.
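Just to put a number on it (plain MATLAB arithmetic, nothing toolbox-specific):

Nfft = 128;  Nsc = 12;
ampFactor = sqrt(Nsc/Nfft);         % the sqrt(12/128) amplitude factor you mentioned
powerGapdB = 10*log10(Nfft/Nsc)     % = -20*log10(ampFactor), returns 10.28 dB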
Here's what I found that supports your concern. The WLAN Toolbox actually handles this explicitly. In their documentation for the convertSNR function ( https://www.mathworks.com/help/wlan/ref/convertsnr.html ), they have parameters specifically for FFTLength and NumActiveSubcarriers, and they show code like "SNRdB = SNRSC - 10*log10(ofdmInfo.FFTLength/ofdmInfo.NumTones)" to account for energy in unused subcarriers. This suggests that MathWorks is aware of this issue in at least one of their wireless toolboxes and handles it there.
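Transplanting that documented WLAN formula onto the NB-IoT numbers, the conversion would look something like the lines below. To be clear, this is my sketch of the idea, not anything the LTE Toolbox endorses; SNRSCdB is an example value and I'm reading SNRSC as per-subcarrier SNR and SNRdB as per-sample SNR, following the WLAN doc's naming:

SNRSCdB = 10;                        % SNR per active subcarrier, dB (example value)
FFTLength = 128;  NumTones = 12;     % NB-IoT downlink: 12 used tones, 128-point FFT
SNRdB = SNRSCdB - 10*log10(FFTLength/NumTones);   % per-sample SNR, ~10.3 dB lower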
However, here's where it gets interesting. When I looked at other LTE examples, such as PDSCH Throughput for TM7-10 ( https://www.mathworks.com/help/lte/ug/pdsch-throughput-for-non-codebook-based-precoding-schemes-port-5-tm7-port-7-or-8-or-port-7-8-tm8-port-7-14-tm9-and-tm10.html ), I found they use exactly the same noise normalization formula as the NPDSCH example: N0 = 1/(sqrt(2.0*ntxants*double(ofdmInfo.Nfft))*SNR). None of them includes a factor for used versus total subcarriers, and the pattern is consistent across every LTE Toolbox example I could find.
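For context, the surrounding code in those examples generates the noise along these lines (I'm reconstructing this from memory of the published examples, so double-check it against the actual example code; ntxants, ofdmInfo, and txWaveform are the examples' own variable names):

SNRdB = 10;                          % SNR point from the simulation loop
SNR = 10^(SNRdB/20);                 % note: converted as an amplitude ratio, not power
N0 = 1/(sqrt(2.0*ntxants*double(ofdmInfo.Nfft))*SNR);
noise = N0*complex(randn(size(txWaveform)), randn(size(txWaveform)));
rxWaveform = txWaveform + noise;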
So we've got two possibilities here. Either there's a systematic issue across all LTE examples where they're missing this normalization factor, or the LTE Toolbox is using a different SNR definition than the WLAN Toolbox. The key question is what "SNR" actually means in these examples. If it's SNR per subcarrier (like WLAN seems to define it), then you'd need that additional factor. But if it's SNR per time-domain sample, then maybe the IFFT normalization already handles the scaling and no additional factor is needed.
What makes this tricky is that I can't tell from the documentation exactly how lteOFDMModulate and lteSCFDMAModulate scale their outputs internally. MATLAB's ifft function uses 1/N normalization by default, and there might be additional scaling happening inside these functions that we're not seeing. The fact that ALL the LTE examples use the same formula makes me think it might be intentional rather than a bug, but the WLAN Toolbox evidence suggests otherwise.
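You can at least verify the base ifft convention directly, independent of the toolbox functions:

X = complex(randn(128,1), randn(128,1));   % frequency-domain symbols
x = ifft(X);                               % MATLAB applies the 1/N factor here
sum(abs(x).^2)                             % equals sum(abs(X).^2)/128, confirming 1/N scaling
% so per-sample time-domain power depends on how many bins actually carry energy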
I think the best way forward would be to run a controlled test. You could create a simple OFDM signal with known power, add noise with a specific variance, and then measure the actual SNR after demodulation to see if it matches what you'd expect. Compare the measured SNR with both formulas (with and without the sqrt(12/128) factor) and see which one gives you the correct result. That would give you definitive proof one way or the other.
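Here's a rough version of that test in base MATLAB. No toolbox calls, so the modulator scaling is fully explicit; the grid sizes and variable names are just illustrative, not taken from the NPDSCH example:

Nfft = 128;  Nsc = 12;  Nsym = 2000;       % NB-IoT-like grid: 12 of 128 bins used
SNRdB = 10;  SNR = 10^(SNRdB/10);          % target SNR per time-domain sample (power)

% QPSK on the 12 centre subcarriers, zeros elsewhere
scIdx = (Nfft/2 - Nsc/2 + 1):(Nfft/2 + Nsc/2);
txGrid = zeros(Nfft, Nsym);
txGrid(scIdx,:) = (sign(randn(Nsc,Nsym)) + 1i*sign(randn(Nsc,Nsym)))/sqrt(2);

tx = ifft(ifftshift(txGrid,1), Nfft);      % base MATLAB ifft, 1/N-normalized

% Set the noise variance against the *measured* time-domain signal power
sigPow = mean(abs(tx(:)).^2);
N0 = sqrt(sigPow/(2*SNR));
noise = N0*complex(randn(size(tx)), randn(size(tx)));

rxGrid    = fftshift(fft(tx + noise, Nfft), 1);
noiseGrid = fftshift(fft(noise, Nfft), 1);

sigPowSC   = mean(abs(rxGrid(scIdx,:) - noiseGrid(scIdx,:)).^2, 'all');
noisePowSC = mean(abs(noiseGrid(scIdx,:)).^2, 'all');

fprintf('Per-sample SNR set: %.1f dB\n', SNRdB);
fprintf('Per-subcarrier SNR measured: %.1f dB\n', 10*log10(sigPowSC/noisePowSC));
fprintf('Predicted gap 10*log10(Nfft/Nsc): %.1f dB\n', 10*log10(Nfft/Nsc));

If the measured per-subcarrier SNR comes out about 10.3 dB above the per-sample value, that confirms the two definitions really do differ by exactly the factor you identified, and the remaining question is just which definition the NPDSCH example intends. You could then repeat the measurement with lteOFDMModulate in place of the raw ifft to see which convention the toolbox's own scaling produces.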
Given the significance of a potential 10 dB error and the fact that this affects an official MathWorks example that people rely on for system design, I'd strongly recommend contacting MathWorks technical support directly with your findings. They have access to the internal implementation details of the OFDM modulation functions that we can't see from the outside, and they can definitively tell you whether the current formula is correct or if there's a bug that needs fixing. If it turns out there is an issue, they can also push a fix to the example and update the documentation so others don't run into the same problem. You've done solid analysis here and this deserves an official response from the toolbox developers. Make sure to reference the WLAN Toolbox's different approach in your support request since that's strong evidence that this normalization issue is on their radar in at least one toolbox.
The references I found most useful were the WLAN convertSNR documentation showing explicit subcarrier handling, and the PDSCH TM7-10 example showing the consistent pattern across LTE examples. Your observation is well-founded and deserves careful investigation either way.