LTE - RF power per subcarrier?



RFBLOKE
2011-08-17, 05:21 PM
I am trying to determine a value for the RF power level of each sub-carrier that is transmitted from an LTE base station. My first attempt at this is below:-

Based on a 10 MHz channel:

Within the 10 MHz channel there are 50 resource blocks, each of which has 12 sub-carriers (15 kHz spacing). We therefore have 50 x 12 = 600 sub-carriers in the 10 MHz channel. Taking the maximum base-station transmitter power as 43 dBm (20 W), each sub-carrier gets 20,000 mW / 600 ≈ 33.3 mW, i.e. 43 - 10*log10(600) ≈ 15.22 dBm.
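A quick sketch of that arithmetic in Python (my own illustration, not part of the original post; the 43 dBm and 50-RB figures are the ones quoted above, and an equal power split across all subcarriers is assumed):

```python
import math

TOTAL_TX_POWER_DBM = 43.0   # ~20 W base-station PA output
N_RB = 50                   # resource blocks in a 10 MHz channel
SC_PER_RB = 12              # subcarriers per RB at 15 kHz spacing

n_sc = N_RB * SC_PER_RB                                   # 600 subcarriers
per_sc_dbm = TOTAL_TX_POWER_DBM - 10 * math.log10(n_sc)   # equal split assumed

print(f"{n_sc} subcarriers -> {per_sc_dbm:.2f} dBm per subcarrier")
# 600 subcarriers -> 15.22 dBm per subcarrier
```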

Is this correct?

gprastomo
2011-08-17, 06:04 PM
Hi,

With 10 MHz you have a maximum of 50 Resource Blocks, with 12 subcarriers per RB at 15 kHz spacing.

So with 43 dBm of total power, you will have 43 - 10*log10(50 x 12) = 15.22 dBm for the Reference Signal power at the amplifier end.

So this 15.22 dBm is not per-subcarrier power.

Then let's say we have -80 dBm of RSSI; the RSRP will then be -80 - 10*log10(600) ≈ -107.8 dBm.
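The same book-keeping in a small sketch (my own addition; it simply assumes the measured wideband power is spread evenly over all 600 subcarriers, as in the calculation above):

```python
import math

N_SC = 600  # 50 RBs x 12 subcarriers in a 10 MHz channel

def per_subcarrier_level(wideband_dbm: float, n_sc: int = N_SC) -> float:
    """Per-subcarrier level when a wideband power (total TX power or RSSI)
    is assumed to be spread evenly over n_sc subcarriers."""
    return wideband_dbm - 10 * math.log10(n_sc)

print(f"{per_subcarrier_level(43.0):.2f} dBm")   # TX side -> 15.22 dBm
print(f"{per_subcarrier_level(-80.0):.1f} dBm")  # RSSI -80 dBm -> -107.8 dBm
```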

RFBLOKE
2011-08-17, 06:24 PM
I am struggling to understand this.

It is my understanding that the base station can transmit a maximum power of 20 watts in any channel; putting this another way, a maximum of 20 watts of RF power can be devoted to communicating with any single user. However, because the LTE system uses OFDM, the 20 watts is split up among a number of sub-carriers instead of being devoted to a single carrier. What I am trying to determine is the actual RF power level of each of the sub-carriers. It would be good to know both the largest power level a single sub-carrier could have and the typical level. Putting this yet another way: what power level would I see if I were to connect a narrowband receiver (15 kHz wide) to the output of the LTE base station, set up with one sub-carrier dead centre in the narrowband receiver's passband?

The reason I am trying to determine this information is to estimate the potential effect of an LTE base station on an item of narrowband receiver kit I am currently working on.

gprastomo
2011-08-17, 07:08 PM
Hi,

LTE uses OFDM for modulation, and the data and signalling symbols are carried on the subcarriers. Roughly speaking, the transmission is divided into two big parts: the Resource Blocks (data) and the Reference Signals.

So to keep it simple, I'll do it like this:
1. 100% loading, 50 RBs with 12 subcarriers each.
2. a 43 dBm PA output.

Let's say we have no link loss (0 dB),
so we will have 15.52 dBm for RSRP; you can use this as the power level for the Reference Signal.
So the data-symbol power will be 19,953 mW (43 dBm) - 35.6 mW ≈ 19,917 mW.
This 19,917 mW is spread over 600 subcarriers, so the power per subcarrier in an RB is 19,917 / 600 ≈ 33.2 mW ≈ 15.21 dBm.

Actually it is almost the same value as the RSRP, but it comes from a different derivation.
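The derivation above, restated as a short sketch (my own addition; it uses the corrected 15.22 dBm reference-signal figure from the next posts, and simply subtracts that one subcarrier's power before splitting the remainder over 600 subcarriers, as described):

```python
import math

def dbm_to_mw(dbm: float) -> float:
    return 10 ** (dbm / 10)

def mw_to_dbm(mw: float) -> float:
    return 10 * math.log10(mw)

total_mw = dbm_to_mw(43.0)     # ~19,953 mW at the PA output
rs_mw = dbm_to_mw(15.22)       # one reference-signal subcarrier, ~33.3 mW
data_mw = total_mw - rs_mw     # power left over for the data symbols
per_sc_mw = data_mw / 600      # spread over the 600 subcarriers

print(f"data pool {data_mw:.0f} mW, per subcarrier {per_sc_mw:.2f} mW "
      f"({mw_to_dbm(per_sc_mw):.2f} dBm)")
# data pool 19919 mW, per subcarrier 33.20 mW (15.21 dBm)
```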

RFBLOKE
2011-08-17, 07:42 PM
Where did the 15.52 dBm come from? Did you mean 15.22 dBm?

gprastomo
2011-08-17, 09:10 PM
Yes, 15.22; sorry for the typo.

RFBLOKE
2011-08-17, 09:42 PM
OK.

So far it looks like 15.22 dBm is a good first guess at the RF power level for a single sub-carrier, which is not too scary. However, I believe our sums thus far are based on the spec limit for the base-station power amplifier output (20 watts), which isn't what's actually transmitted over the air. From what I can see in the little literature knocking around on the subject, the EIRP doesn't have any upper limit at all - surprising! Anyway, taking the typical antenna gain figure that seems to be suggested for LTE (16 dBi), we are probably looking at an EIRP of around 43 dBm + 16 dB = 59 dBm. The sub-carrier power is therefore going to be 59 - 10*log10(600) ≈ 31.22 dBm.
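A sketch of that EIRP arithmetic (my own addition; the 16 dBi gain is the assumed value above, and feeder/connector losses are ignored here):

```python
import math

PA_POWER_DBM = 43.0      # base-station PA output
ANTENNA_GAIN_DBI = 16.0  # typical LTE panel gain assumed above
N_SC = 600               # subcarriers in a 10 MHz channel

eirp_dbm = PA_POWER_DBM + ANTENNA_GAIN_DBI           # 59 dBm, losses ignored
per_sc_eirp_dbm = eirp_dbm - 10 * math.log10(N_SC)   # per-subcarrier EIRP

print(f"EIRP {eirp_dbm:.0f} dBm -> {per_sc_eirp_dbm:.2f} dBm per subcarrier")
# EIRP 59 dBm -> 31.22 dBm per subcarrier
```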

gprastomo
2011-08-17, 09:55 PM
Yes.
You need to consider the other losses, since this 15.22 dBm is referenced to the amplifier termination. For the EIRP you need to account for the connector loss + feeder loss (or maybe optical loss if you use an optical interface) together with the antenna gain; then you'll get the EIRP at the antenna end.

If you want to calculate the level at the user terminal, you also need to include the propagation-model loss, which is around 120 - 130 dB.
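Put together as a rough per-subcarrier link budget, it might look like the sketch below (my own addition; the feeder-loss and path-loss values are placeholders in the ranges mentioned in this thread, not measured figures):

```python
per_sc_pa_dbm = 15.22    # per-subcarrier power at the amplifier termination
antenna_gain_db = 16.0   # antenna gain assumed earlier in the thread
feeder_loss_db = 3.0     # example connector + feeder loss (placeholder)
path_loss_db = 125.0     # propagation loss, somewhere in the 120-130 dB range

per_sc_eirp_dbm = per_sc_pa_dbm + antenna_gain_db - feeder_loss_db
per_sc_at_ue_dbm = per_sc_eirp_dbm - path_loss_db

print(f"per-subcarrier EIRP {per_sc_eirp_dbm:.2f} dBm, "
      f"level at the UE {per_sc_at_ue_dbm:.2f} dBm")
# per-subcarrier EIRP 28.22 dBm, level at the UE -96.78 dBm
```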

mrs2mrs
2012-01-09, 07:21 PM
Hi guys,
I'm new at this and I want to make a few things clear. Let's assume that for the case above the EIRP is 59 dBm. 59 - 15.22 = 43.78 dBm = 23.88 W; 23.88 / 600 = 0.0398 W = 16 dBm, and this is the power per single subcarrier at the antenna end. So now, to calculate the power at the UE end, I have to subtract about 120 dB for path loss. What I get is -104 dBm, which is about 4e-11 mW (roughly 0.04 pW) on the user side. Is my way of thinking right? Isn't that a rather low value?
I would really appreciate a quick reply. Thank you in advance!
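One part that is easy to slip on here is the dBm-to-milliwatt conversion, so a one-line check (my own addition, simply converting the -104 dBm figure above):

```python
def dbm_to_mw(dbm: float) -> float:
    return 10 ** (dbm / 10)

print(f"{dbm_to_mw(-104):.2e} mW")  # -> 3.98e-11 mW, i.e. about 0.04 pW
```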

pyrague2010
2012-02-27, 09:18 AM
It's correct: each sub-carrier therefore has 43 - 10*log10(600) ≈ 15.22 dBm of power. This power, minus the maximum path loss (with antenna gain and losses included), gives the minimum RSRP for each clutter morphology (DU/UR/SU).

If MIMO is used, the TX power is calculated for each transmit path.
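A sketch of that "per-subcarrier power minus maximum path loss gives the minimum RSRP per clutter type" statement (my own addition; the maximum-path-loss numbers below are illustrative placeholders, not planning values, and the 31.22 dBm per-subcarrier EIRP is the figure from earlier in the thread):

```python
PER_SC_EIRP_DBM = 31.22  # per-subcarrier EIRP (43 dBm PA + 16 dBi over 600 subcarriers)

# Illustrative maximum allowed path loss per clutter morphology (placeholders).
max_path_loss_db = {
    "dense urban (DU)": 128.0,
    "urban (UR)": 133.0,
    "suburban (SU)": 138.0,
}

for clutter, mapl in max_path_loss_db.items():
    print(f"{clutter}: minimum RSRP ~ {PER_SC_EIRP_DBM - mapl:.1f} dBm")
# dense urban (DU): minimum RSRP ~ -96.8 dBm, and so on
```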

firstmaxim
2012-02-28, 03:44 AM
But do we really need to estimate the power of an individual subcarrier, when the granularity of allocation by the packet scheduler is a resource block containing 12 subcarriers?

firstmaxim
2012-02-28, 03:52 AM
Quoting gprastomo's earlier reply:

"With 10 MHz you have a maximum of 50 Resource Blocks, with 12 subcarriers per RB at 15 kHz spacing. So with 43 dBm of total power, you will have 43 - 10*log10(50 x 12) = 15.22 dBm for the Reference Signal power at the amplifier end. So this 15.22 dBm is not per-subcarrier power. Then let's say we have -80 dBm of RSSI; the RSRP will then be -80 - 10*log10(600) ≈ -107.8 dBm."

The reference signals are already interleaved across the 50 resource blocks that you have highlighted, so you cannot apportion the 15.22 dBm to the Reference Signal power.