Simulating bad LTE signal

I am testing my IoT device, which is equipped with a Digi XBee3 LTE-M/NB-IoT modem. The device currently connects to the AT&T CAT-M1 network, and I am seeing an RSRP around -100 dBm and an RSRQ around -13 dB on Band 12 (700 MHz) in my area.

I am trying to “simulate” a bad-signal condition to make sure the device can recover from IP network connection loss. If I add an RF signal attenuator to the antenna, I can make the RSRP drop below -120 dBm and still maintain a fairly good connection. I am wondering if there is a way to simulate bad RSRQ, i.e., to introduce noise that degrades the signal quality. I tried wrapping aluminum foil around the antenna, but that only seemed to reduce RSRP, pretty much the same as adding an attenuator.

Ideally I’d like to simulate a condition where the LTE signal comes and goes. I don’t have a callbox.

Any suggestions? Thanks.
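One software-only way to make the signal “come and go” is to toggle the modem’s radio on a randomized schedule. Whether this is available depends on your firmware: the XBee3 Cellular line documents an airplane-mode (AM) AT command, and most 3GPP modems accept AT+CFUN=0/1 — treat both as assumptions to verify against your device. A sketch of the scheduling side, with the serial I/O omitted:

```python
import random

def outage_schedule(total_s, up_range=(30, 120), down_range=(5, 60), seed=None):
    """Yield (state, duration_s) pairs alternating "up" and "down"
    until total_s seconds of test time are consumed. Durations are
    drawn uniformly from the given ranges so outages are irregular."""
    rng = random.Random(seed)
    elapsed, state = 0.0, "up"
    while elapsed < total_s:
        lo, hi = up_range if state == "up" else down_range
        dur = min(rng.uniform(lo, hi), total_s - elapsed)
        yield state, dur
        elapsed += dur
        state = "down" if state == "up" else "up"

# Driving the modem would then look roughly like this (pseudocode;
# the port handling and exact AT command are assumptions):
#   for state, dur in outage_schedule(3600, seed=1):
#       send_at("AT+CFUN=1" if state == "up" else "AT+CFUN=0")
#       time.sleep(dur)
```

Note that disabling the radio looks like a clean deregistration to the device, not RF fading, so it exercises the reconnect logic but not marginal-signal behavior; the attenuator is still the better tool for the latter.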

I would suggest a variable attenuator. That way you can dial the attenuation up to the point that causes a disconnect, then back it off to allow a reconnection.

mvut - Thanks for the response.

It is actually quite interesting to play with RF attenuators. I notice that if I drop the RSRP below, say, -110 dBm, the modem has a hard time registering with the cell network. However, once it is registered, the RSRP can go further down, which causes the data connection to drop frequently while the cellular registration is still maintained.
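For repeatable runs it helps to log registration state and signal together while stepping the attenuator, since those are exactly the two things that diverge here. If you can reach the cellular component's 3GPP AT interface (the XBee3 Cellular hardware documentation describes a USB Direct mode into the underlying u-blox modem — verify this against your variant), the standard AT+CEREG? and AT+CESQ commands from 3GPP TS 27.007 report EPS registration status and RSRP/RSRQ. A minimal parsing sketch, with the serial plumbing omitted:

```python
import re

def parse_cesq(line):
    """Parse a "+CESQ: ..." response (3GPP TS 27.007) into (rsrp_dbm, rsrq_db).

    Returns None for a field the modem reports as unknown (index 255).
    Index-to-dB mapping follows TS 27.007: rsrp index 0..97 spans
    -141..-44 dBm; rsrq index 0..34 spans -20..-3 dB in 0.5 dB steps.
    """
    m = re.match(r"\+CESQ:\s*(\d+),(\d+),(\d+),(\d+),(\d+),(\d+)", line)
    if not m:
        return None
    rsrq_idx, rsrp_idx = int(m.group(5)), int(m.group(6))
    rsrp = rsrp_idx - 141 if rsrp_idx != 255 else None
    rsrq = rsrq_idx / 2 - 20 if rsrq_idx != 255 else None
    return rsrp, rsrq

def parse_cereg(line):
    """Parse "+CEREG: <n>,<stat>" and return the registration status code."""
    m = re.match(r"\+CEREG:\s*\d+,(\d+)", line)
    return int(m.group(1)) if m else None

REGISTERED = {1, 5}  # 1 = registered (home network), 5 = registered (roaming)
```

Polling both every few seconds and timestamping each attenuator step should let you pin down the RSRP/RSRQ thresholds where registration survives but the data connection does not.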