Power
Each cell in an LTE radio network transmits a cell-specific reference signal (RS) from its transmit antennas. The transmit power of a resource element (RE) carrying this reference signal can be set equal to, higher than, or lower than the transmit power of an RE carrying the physical downlink shared channel (PDSCH). Let's take a quick look at reference signal power boosting, where an RS RE uses more power than a PDSCH RE. RS power boosting may or may not be desirable from an RF performance perspective.
The relative transmit power levels of the RS and the PDSCH have implications for downlink channel estimation, for the amount of downlink interference, and for how the eNodeB interprets and uses the channel quality indicator (CQI). For example, if the RS power is boosted, the UE could potentially make RS measurements (e.g., RSRP and RSRQ) more easily and quantify downlink channel conditions more reliably. However, the overall interference on the RS REs of a given cell would increase, because neighboring cells also transmit more power on their own RS REs. If the signal-to-interference-plus-noise ratio (SINR) estimated for the PDSCH degrades significantly, the CQI reported by the UE would be lower. If the reported CQIs are relatively lower, the eNodeB would aim for a lower target throughput by taking actions such as increasing the amount of Turbo coding in PDSCH transmissions. The throughput perceived by the user could therefore be slightly lower when the RS power is boosted. However, if the better channel estimation and the higher reliability of PDSCH reception lead to fewer HARQ retransmissions, throughput could actually increase with RS power boosting. In summary, the theoretical impact of RS power boosting on RF performance is not definitive. Field trials with different RS power boost levels and different traffic load levels are recommended to determine the suitability of RS power boosting.
The eNodeB broadcasts the transmit power levels of the RS and the PDSCH in SIB 2 using the parameters referenceSignalPower, PA, and PB. The transmit power of an RE carrying the RS (in dBm) is specified as referenceSignalPower. PA influences a parameter called ρA, which is the ratio of the PDSCH RE transmit power to the RS RE transmit power; ρA applies to OFDM symbols that do not carry the RS. PB establishes the relationship between ρA and ρB, where ρB is the ratio of the PDSCH RE transmit power to the RS RE transmit power in OFDM symbols that carry the RS. PA ranges from 0 to 7 and corresponds to the range -6 dB to +3 dB for ρA. PB ranges from 0 to 3 and corresponds to the range 5/4 to 1/2 for ρB/ρA [1].
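To make the mapping concrete, here is a small Python sketch of the PA/PB-to-ratio lookup described above. The per-index dB values follow the usual 3GPP 36.213 conventions and should be treated as an assumption to be checked against the release in use.

```python
# Hedged sketch: mapping the SIB2 parameters PA and PB to the power ratios
# rho_A and rho_B (PDSCH-RE power / RS-RE power), per the ranges quoted above.

# PA index (p-a, 0..7) -> rho_A in dB, for OFDM symbols that do not carry the RS
PA_TO_RHO_A_DB = {0: -6.0, 1: -4.77, 2: -3.0, 3: -1.77, 4: 0.0, 5: 1.0, 6: 2.0, 7: 3.0}

# PB index (p-b, 0..3) -> rho_B / rho_A, assuming a 2- or 4-antenna-port eNodeB
PB_TO_RATIO = {0: 5 / 4, 1: 1.0, 2: 3 / 4, 3: 1 / 2}

def pdsch_to_rs_ratios(pa: int, pb: int):
    """Return (rho_A, rho_B) as linear ratios for the given PA/PB indices."""
    rho_a = 10 ** (PA_TO_RHO_A_DB[pa] / 10)
    rho_b = PB_TO_RATIO[pb] * rho_a
    return rho_a, rho_b

# Example 1 below: PA = 2, PB = 1  ->  rho_A ~= 0.5 (-3 dB), rho_B ~= 0.5
print(pdsch_to_rs_ratios(2, 1))
# Example 2 below: PA = 4, PB = 1  ->  rho_A = 1 (0 dB), rho_B = 1
print(pdsch_to_rs_ratios(4, 1))
```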
Let's work through two numerical examples. Suppose a 30 W power amplifier is used for one transmit antenna of an eNodeB, and a downlink bandwidth of 10 MHz is deployed in the cell. The nominal transmit power per subcarrier is (30 W / 600) = 50 mW. During an OFDM symbol in which no RS is present, each PDSCH subcarrier is allocated 50 mW.
Example 1: PA = 2 and PB = 1 (with a 3 dB power boost for the RS)
PA = 2 implies ρA = 0.5, or -3 dB. Hence, (power on PDSCH RE / power on RS RE) = 0.5. Since the transmit power assigned to a PDSCH RE is the nominal power level of 50 mW, the transmit power assigned to an RS RE is (power on PDSCH RE / 0.5) = (50 mW / 0.5) = 100 mW. During an OFDM symbol carrying the RS, the number of REs carrying the RS from one transmit antenna is (50 physical resource blocks * 2 REs per physical resource block) = 100 REs. Furthermore, a given antenna transmits no power on another set of 100 REs, because that set is used by a different transmit antenna. Therefore, out of the 600 REs in an OFDM symbol carrying the RS, 100 REs are subject to RS power boosting, 100 REs carry no transmit power, and the remaining 400 REs are at the nominal power level. The total transmit power during the OFDM symbol carrying the RS would be (100 subcarriers * 100 mW per subcarrier for boosted RS REs) + (100 subcarriers * 0 mW for null REs) + (400 subcarriers * 50 mW per subcarrier for non-RS REs) = 30 W. Thus, when PA = 2 and PB = 1, each RS RE is allocated 100 mW, while a non-RS RE (in any OFDM symbol) is allocated either 0 mW or 50 mW. referenceSignalPower is set to 10*log10(100 mW) = 20 dBm.
Example 2: PA = 4 and PB = 1 (no power boost for the RS)
PA = 4 implies ρA = 1, or 0 dB. Hence, (power on PDSCH RE / power on RS RE) = 1. Since the transmit power assigned to a PDSCH RE is the nominal power level of 50 mW, the transmit power assigned to an RS RE is (power on PDSCH RE / 1) = (50 mW / 1) = 50 mW. Therefore, out of the 600 REs in an OFDM symbol carrying the RS, 100 REs are at the RS power level, 100 REs carry no transmit power, and the remaining 400 REs are at the nominal power level. The total transmit power during the OFDM symbol carrying the RS would be (100 subcarriers * 50 mW per subcarrier for non-boosted RS REs) + (100 subcarriers * 0 mW for null REs) + (400 subcarriers * 50 mW per subcarrier for non-RS REs) = 25 W. Thus, when PA = 4 and PB = 1, each RS RE is allocated 50 mW, while a non-RS RE (in any OFDM symbol) is allocated either 0 mW or 50 mW. referenceSignalPower is set to 10*log10(50 mW) = 17 dBm.
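The arithmetic of the two examples is easy to re-check programmatically. The sketch below assumes the same setup as above: a 30 W amplifier, 600 subcarriers, and a 100 RS / 100 null / 400 PDSCH RE split in RS-bearing symbols.

```python
# Hedged sketch: reproducing the arithmetic of Examples 1 and 2 above.
import math

def rs_symbol_power(total_w=30.0, n_subcarriers=600, rho_a=0.5,
                    n_rs_re=100, n_null_re=100):
    """Return (RS RE power in mW, total symbol power in W, referenceSignalPower in dBm)."""
    pdsch_re_mw = total_w * 1000 / n_subcarriers           # nominal 50 mW per subcarrier
    rs_re_mw = pdsch_re_mw / rho_a                          # boosted if rho_a < 1
    n_pdsch_re = n_subcarriers - n_rs_re - n_null_re
    total_mw = n_rs_re * rs_re_mw + n_pdsch_re * pdsch_re_mw  # null REs contribute 0 mW
    ref_signal_power_dbm = 10 * math.log10(rs_re_mw)
    return rs_re_mw, total_mw / 1000, ref_signal_power_dbm

print(rs_symbol_power(rho_a=0.5))  # Example 1: (100.0 mW, 30.0 W, 20.0 dBm)
print(rs_symbol_power(rho_a=1.0))  # Example 2: (50.0 mW, 25.0 W, ~17.0 dBm)
```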
RSRP stands for Reference Signal Received Power. It is a measure of the signal strength of the reference signals used by LTE (Long Term Evolution) networks. The reference signals are used by the network to estimate the signal strength and quality at the UE (user equipment), i.e., the mobile device.
RSRP is measured in decibel-milliwatts (dBm). A higher RSRP value indicates a stronger signal. The typical range of RSRP values is between -140 dBm and -44 dBm. A value of -140 dBm indicates a very weak signal, while a value of -44 dBm indicates a very strong signal.
RSRP is used by the network to determine which cell the UE should connect to: the UE connects to the cell with the highest RSRP value. RSRP is also used to calculate the SINR (signal-to-interference-plus-noise ratio), which is a measure of signal quality.
Here are some of the benefits of using RSRP:
It can be used to determine which cell the UE should connect to.
It can be used to calculate the SINR, which is a measure of signal quality.
It can be used to monitor the signal strength and quality of the network.
It can be used to troubleshoot network problems.
RSRP is an important metric for LTE networks. It is used to ensure that UEs are connected to the strongest signal and that signal quality is maintained.
The DL reference signal (DL RS) in LTE serves a similar function to the UL RS, but for the downlink direction. It allows the UE to measure the downlink channel conditions and report them to the network, enabling better scheduling and interference management. It also helps ensure that the UE receives signals accurately with minimal distortion or noise, which affects the data rate and overall performance. In addition, it allows the eNB to adjust its transmit power based on feedback from the UE, optimizing communication in both directions.
Reference signals are generated using a known algorithm or a standardized sequence. The UL RS is transmitted by the UE on uplink physical channels (e.g., together with the PUCCH or PUSCH), while the DL RS is sent by the eNodeB across the downlink, alongside channels such as the PDCCH and PDSCH, which carry scheduling information, resource allocation, and other parameters related to data transmission. The network uses these signals to measure the quality of the wireless link between the UE's antenna and its own antennas, adjust power levels accordingly, and optimize overall performance.
There are many different approaches to channel estimation, but the fundamental concepts are similar. The process works as follows:
i) Set up a mathematical model that relates the "transmitted signal" and the "received signal" through the "channel" matrix.
ii) Transmit a known signal (normally called a "reference signal" or "pilot signal") and detect the received signal.
iii) By comparing the transmitted signal and the received signal, we can work out each element of the channel matrix.
As an example, I will briefly describe how this process works in LTE. Of course, I cannot write out every detail of this process in LTE, and many details are implementation-dependent (meaning that the detailed algorithm can vary with each specific chipset implementation). However, the general concept is similar.
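As a toy illustration of steps i) to iii), the following Python sketch performs least-squares channel estimation on known pilots for a single OFDM symbol. The pilot positions and the flat per-subcarrier channel model are simplifying assumptions, not the actual LTE algorithm.

```python
# Hedged sketch: least-squares pilot-based channel estimation for one OFDM symbol.
import numpy as np

rng = np.random.default_rng(0)
n_sc = 12                                   # one RB worth of subcarriers (toy size)
pilot_idx = np.array([0, 3, 6, 9])          # assumed pilot positions

# i) model: y[k] = H[k] * x[k] + n[k] per subcarrier k
h_true = (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc)) / np.sqrt(2)
x = np.ones(n_sc, dtype=complex)            # known pilot values; data REs omitted
noise = 0.05 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))
y = h_true * x + noise                      # ii) the "received" signal

# iii) LS estimate at pilot positions, then interpolate across subcarriers
h_ls = y[pilot_idx] / x[pilot_idx]
h_est = np.interp(np.arange(n_sc), pilot_idx, h_ls.real) \
        + 1j * np.interp(np.arange(n_sc), pilot_idx, h_ls.imag)

print("mean estimation error:", np.mean(np.abs(h_est - h_true)))
```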
CSI-RS classification and role
Channel state information: the channel state information obtained from the reference signal is very important for improving the overall performance of the wireless system. After obtaining the CSI, the base station can schedule the MCS according to the channel quality, allocate RB resources, assign beams to improve throughput, perform MU-MIMO multi-user multiplexing, and so on.
In LTE, thanks to the existence of the CRS (up to 4 antenna ports), the UE can measure the CRS and report CSI as long as the number of spatial multiplexing layers does not exceed 4. LTE Release 10 introduced the concept of the CSI-RS, which supports channel state feedback for more than 4 spatial multiplexing layers and more than 4 antenna ports; CSI-RS in LTE also supports CSI-IM, ZP-CSI-RS, and so on. In NR, since there is no always-on CRS, the CSI-RS is needed for channel state feedback for a larger number of antenna ports (up to 32) and for time-frequency tracking (TRS). Compared with the CRS, the CSI-RS in NR has less overhead and supports a larger number of antenna ports.
The overall CSI reporting process in NR is as follows: the base station sends the CSI-RS according to the configuration; the UE measures the CSI-RS (including channel measurement and interference measurement) and reports the CSI results; the base station then performs scheduling-related processing based on the reported CSI results.
Channel measurement
Channel quality measurements are divided into normal CSI measurements (non-zero power, with UE reporting that includes CQI, PMI, RI, and LI) and CSI-IM measurements (interference measurements, used to measure the amount of interference for the SINR).
General CSI measurement (NZP-CSI-RS)
Content carried by the NZP CSI-RS signal.
The CSI report content, i.e., the content actually transmitted on the REs, includes:
CRI: CSI-RS Resource Indicator. The UE indicates the best CSI-RS resource, corresponding to the best beam.
RI: Rank Indicator. As in LTE, the UE derives the downlink transmission rank, i.e., the corresponding number of transmission layers. For a single codeword, RI ranges from 1 to 4; for a dual codeword, RI ranges from 1 to 8.
PMI: Precoding Matrix Indicator. The index of the precoding matrix recommended by the UE, according to the RI.
CQI: Channel Quality Indicator. The appropriate modulation and coding scheme, derived according to the PMI, reported for each codeword.
LI: Layer Indicator. LI indicates the strongest layer in the downlink, on which the PT-RS reference signals are sent.
SSB-RI: SS/PBCH block Resource Indicator.
Network architecture of 5G
5G Time Domain Resources-1
Information Summary
Through this article, you can learn

What are the origins of the basic time units Tc and Ts in 5G systems?

What are inter-symbol interference and inter-subcarrier interference? Why can cyclic prefixes (including the extended CP) solve this problem?

What are the formats and types of time slots? What are the discussions in the industry?

What is a mini-slot and what is the description of a mini-slot in the protocol?
Overview
The time and frequency resources of the 5G system are similar to those of LTE, including the time and
frequency domains. 3GPP has redesigned the time and frequency resources of the air interface on the basis
of LTE when developing the 5G protocol so that the network capacity of 5G can meet the system design goals
and the QoS requirements of various services in the system while taking into account the smooth evolution
from LTE to 5G. This paper introduces the concepts related to time-domain resources in 5G systems from the
following aspects.
1. Numerology
2. Physical layer time units
3. Frame structure
4. OFDM symbols and CPs
5. Time slot types and formats
6. Time slot configuration
Numerology
5G NR introduces the concept of Numerology, which LTE does not have. A Numerology includes the SCS (Sub-Carrier Spacing) and the corresponding parameters such as symbol length and CP (Cyclic Prefix) length. Since there is a fixed mapping between the SCS and the symbol and CP lengths, the SCS is usually used to refer to the Numerology.
5G system compared with LTE:

Service types will be richer, with three major service types defined in the 3GPP protocol: eMBB
(enhanced Mobile BroadBand), URLLC (Ultra-Reliable Low-Latency Communications), and mMTC
(massive Machine Type Communications).

There are also more spectrum and bandwidth options available.

Faster mobile speed terminals are supported.
Therefore, in terms of SCS design, 5G NR takes the fixed 15 kHz SCS of LTE as the basis and extends it according to 15 kHz x 2^μ to obtain a series of selectable SCSs that adapt to different service requirements and channel characteristics. For example, the faster the user moves, the larger the Doppler frequency shift of its received signal; in high-speed mobility scenarios, subcarriers with a larger SCS can therefore be used. For a given Doppler frequency shift, the larger the subcarrier spacing, the smaller the relative impact of the frequency offset.
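The 2^μ scaling is easy to tabulate; the short sketch below prints each SCS and the corresponding useful symbol length (1/SCS, CP excluded).

```python
# Hedged sketch: the 15 kHz x 2^mu family of subcarrier spacings and the
# corresponding useful-symbol lengths (cyclic prefix excluded).
for mu in range(5):
    scs_khz = 15 * 2 ** mu
    symbol_us = 1e6 / (scs_khz * 1e3)   # useful symbol length = 1 / SCS, in microseconds
    print(f"mu={mu}: SCS={scs_khz:4d} kHz, useful symbol ~= {symbol_us:6.2f} us")
```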
In spectrum planning, 5G takes into account that some spectrum is already occupied and divides the
spectrum resources into:

FR1: i.e. Sub-6G band, the frequency range is generally below 6GHz, which is usually called 5G low
frequency.

FR2: i.e. Millimeter wave band, the frequency range is generally between 24GHz and 52GHz, usually
called 5G high frequency.
Note:
The latest 3GPP 38.104 protocol defines the FR1 band, whose maximum frequency has been extended to
7125MHz.
The types of SCS supported in different frequency bands are not exactly the same, and the Numerologies
defined in 3GPP 38.211 protocol are shown in the table.
μ | SCS (kHz) | CP | Supported frequency range
0 | 15 | Normal | FR1
1 | 30 | Normal | FR1
2 | 60 | Normal, Extended | FR1, FR2
3 | 120 | Normal | FR2
4 | 240 | Normal | FR2
As can be seen from the table, 5G NR currently supports five SCSs (μ takes values 0 to 4). The smallest SCS is 15 kHz and the largest is 240 kHz. Three SCSs are supported in each of the low and high frequency ranges; 60 kHz is the only SCS supported in both FR1 and FR2, and the only SCS for which the Extended CP is defined.
Physical layer time units
5G NR defines two basic time units Tc and Ts, where Ts is defined in the same way as the time unit defined
in LTE, while Tc is the smallest time unit in 5G NR. Any concept about time length in 5G NR can be expressed
as an integer multiple of Tc.
Ts is defined as Ts = 1 / (Δf_ref × N_f,ref), where Δf_ref = 15 kHz and N_f,ref = 2048, i.e., Ts = 1/(15000 × 2048) s ≈ 32.55 ns.
Tc is defined as Tc = 1 / (Δf_max × N_f), where Δf_max = 480 kHz and N_f = 4096, i.e., Tc = 1/(480000 × 4096) s ≈ 0.509 ns.
There is a multiplicative relationship κ between Ts and Tc: κ = Ts / Tc = 64.
How should we understand the definitions of Ts and Tc? We can start from the Ts defined in LTE. The LTE SCS is fixed at 15 kHz, and the maximum transmission bandwidth corresponds to 1200 subcarriers, so the number of frequency-domain points in the IFFT cannot be less than 1200 if no information is to be lost; since powers of 2 are more convenient in digital systems, Nf is finally set to 2048.
Once the origin of 2048 is understood, the time unit Ts is easier to understand. Since the SCS of LTE is 15 kHz, the length of the data portion of an OFDM symbol is its reciprocal, i.e., 1/15000 of a second. 2048 points in the frequency domain mean 2048 samples in the time domain, and the symbol length divided by 2048 gives the sampling interval, Ts.
Tc in 5G NR is defined in a similar way to Ts. Since the maximum number of subcarriers on the transmission bandwidth defined in NR is 3276 (the maximum number of RBs supported by the system is 273, with 12 subcarriers per RB), 4096 sampling points are chosen, which together with the largest SCS of 480 kHz gives Tc.
Careful friends may have noticed that the current 3GPP protocol defines the maximum SCS as 240kHz, while
480kHz is not defined, and in fact, the 480kHz SCS is reserved for the subsequent evolution of the 5G protocol.
Whether in LTE or in 5G NR, the sampling frequency is fixed. LTE has only one SCS, so the OFDM symbol length is constant and the number of samples per OFDM symbol is also constant; NR has multiple subcarrier spacings, so the OFDM symbol length is not fixed, and neither is the number of samples per OFDM symbol.
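A minimal numeric check of these definitions, using the constants quoted above (15 kHz × 2048 for Ts, 480 kHz × 4096 for Tc):

```python
# Hedged sketch: the basic time units Ts and Tc and their ratio kappa,
# using the 3GPP 38.211 constants quoted above.
Ts = 1 / (15_000 * 2048)       # LTE basic time unit, ~32.55 ns
Tc = 1 / (480_000 * 4096)      # NR basic time unit, ~0.509 ns
kappa = Ts / Tc                # = 64
print(f"Ts = {Ts*1e9:.3f} ns, Tc = {Tc*1e9:.4f} ns, kappa = {kappa:.0f}")
```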
Frame Structure
The frame structure of 5G NR follows the basic framework of LTE, as shown in the figure.
Included:
Frame: fixed length of 10ms, frame number range: 0~1023.
Subframe: length is fixed to 1ms, subframe number range: 0~9.
Slot: when Normal CP is used, the length is 14 symbols. Since the symbol length is not fixed, the time slot
length is also not fixed.
Symbols: length is not fixed and related to SCS.
NOTE:
When the SCS is 60 kHz, Extended CP can also be used, when the time slot length is 12 symbols.
The scheduling time unit in the LTE data plane is the subframe, i.e., 1 ms, while the scheduling time unit in the 5G NR data plane is the time slot. The number of symbols per slot is fixed, but the symbol length depends on the SCS: the larger the SCS, the shorter the symbols, the shorter the time slot, and the shorter the scheduling interval, which is better suited to low-latency service scenarios.
Taking SCSs of 30 kHz and 120 kHz as examples, the relationship between frames, subframes, time slots, and symbols is shown in the figure.
The quantitative relationships between SCS, frames, subframes, time slots, and symbols given in the 3GPP 38.211 protocol are shown in the tables below.
Quantitative relationships between frames, subframes, time slots, and symbols under Normal CP

μ | Symbols per time slot | Time slots per frame | Time slots per subframe
0 | 14 | 10 | 1
1 | 14 | 20 | 2
2 | 14 | 40 | 4
3 | 14 | 80 | 8
4 | 14 | 160 | 16

Quantitative relationships between frames, subframes, time slots, and symbols under Extended CP

μ | Symbols per time slot | Time slots per frame | Time slots per subframe
2 | 12 | 40 | 4
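The Normal CP table above follows directly from the 2^μ scaling; the sketch below reproduces it.

```python
# Hedged sketch: reproducing the Normal-CP table above:
# 14 symbols/slot, slots per subframe = 2^mu, slots per frame = 10 * 2^mu.
for mu in range(5):
    slots_per_subframe = 2 ** mu
    slots_per_frame = 10 * slots_per_subframe
    slot_len_ms = 1 / slots_per_subframe
    print(f"mu={mu}: 14 symbols/slot, {slots_per_frame:3d} slots/frame, "
          f"{slots_per_subframe:2d} slots/subframe, slot length {slot_len_ms:.4f} ms")
```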
OFDM symbols and CP
The OFDM symbols of 5G NR are similar to LTE in that they both consist of two parts, the data symbol, and
the cyclic prefix. As mentioned in the introduction of Tc, SCS determines the length of the data symbols and
indirectly the symbol length and CP length. And the number of symbols per time slot is fixed, so SCS also
determines the length of the time slot. As shown in the figure.
Because radio waves are refracted and reflected during transmission, the receiver may receive copies of the transmitted signal over multiple paths, with a delay difference between the signals received on different paths. As shown in the figure above, the first symbol of the second path falls into the time window of the second symbol of the first path, causing inter-symbol interference. The inter-symbol interference generated by multipath can be avoided by inserting a CP, i.e., a guard interval between symbols, as long as the guard interval is longer than the maximum delay spread of the multipath channel. In practice, the further apart the transmitter and the receiver, the larger the multipath delay spread, so a longer CP is needed; in other words, the smaller the SCS, the longer the CP and the stronger the coverage capability of the subcarrier.
In addition, multipath transmission distorts the signal waveform at the receiver, destroying the orthogonality between subcarriers and leading to inter-subcarrier interference. At the same time, if the inserted guard interval were simply left blank, it would also destroy the orthogonality of the subcarriers. Therefore, the guard interval is filled by copying the last segment of samples of each data symbol to the front of that symbol, and this is called a cyclic prefix. It ensures that, when a multipath copy of the signal is received, the waveform within the FFT window still contains a whole number of periods, preserving the orthogonality between subcarriers and thus reducing inter-subcarrier interference.
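The CP insertion described above can be illustrated with a toy numerical example; the FFT size and delay below are arbitrary assumptions, not NR parameters.

```python
# Hedged sketch: inserting a cyclic prefix by copying the tail of each OFDM
# symbol to its front, as described above (toy sizes, not the NR FFT size).
import numpy as np

def add_cp(time_symbol: np.ndarray, cp_len: int) -> np.ndarray:
    """Prefix the last cp_len samples of the symbol to its front."""
    return np.concatenate([time_symbol[-cp_len:], time_symbol])

rng = np.random.default_rng(1)
fft_size, cp_len = 64, 16
freq_data = rng.normal(size=fft_size) + 1j * rng.normal(size=fft_size)
symbol = np.fft.ifft(freq_data)              # time-domain OFDM symbol
tx = add_cp(symbol, cp_len)

# Any FFT window of length fft_size that starts within the CP (i.e., delayed
# by fewer than cp_len samples) still contains one full period of the symbol,
# so the per-subcarrier magnitudes are recovered and orthogonality is kept.
delay = 5                                     # assumed multipath delay < cp_len
recovered = np.fft.fft(tx[delay:delay + fft_size])
print(np.allclose(np.abs(recovered), np.abs(freq_data)))   # True
```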
In the FR2 frequency range, i.e., the millimeter-wave band, the minimum supported SCS is 60 kHz. Compared with the 15 kHz SCS that can be used in FR1, the shorter data symbol length in the millimeter-wave band also means a shorter normal CP, and therefore a weaker ability to combat multipath effects. 5G NR addresses this by defining an Extended CP for the 60 kHz SCS, with a length of 4.17 μs, which is applied in scenarios where the multipath delay spread is large.
Time slot types and formats
The 3GPP 38.213 protocol classifies slots into three types.

Downlink: denoted by the letter D, used for downlink transmission.

Uplink: denoted by the letter U, used for uplink transmission.

Flexible: denoted by the letter F, can be used for uplink or downlink transmission, and can also be
used as GP (Guard Period) or reserved resources.
Three types of OFDM symbols are also defined: downlink, uplink, and flexible. Each slot can be freely composed of these three types of symbols, forming multiple slot formats. Section 11.1 "UE procedure for determining slot format" of the 3GPP 38.213 protocol gives details of the supported slot formats.
According to the slot formats defined by the protocol, the industry further expands the classification of Flexible-type slots and proposes the concept of a Mixed slot: a slot that contains at least one "downlink" or "uplink" symbol together with one or more "flexible" symbols. On this basis, slot types are divided into four cases.

Case 1: DL-only slot

Case 2: UL-only slot

Case 3: Flexible-only slot

Case 4: Mixed slot
Case 4 can be further divided into several sub-cases, as shown in the figure.
It can be seen that the 5G NR slot format design can achieve symbol-level changes in both uplink and
downlink data, whereas in LTE only subframe level changes can be achieved. This design is more flexible, and
also makes the slot type more abundant to adapt to the service type in different scenarios.
Self-contained time slots
Case4-3 and Case4-4, also known as Self-contained slots, correspond to the two structures of self-contained
slots respectively.
DL-dominant slot
In Case 4-3, the slot is mainly used for downlink data transmission, while a small number of symbols are used for uplink control signals (e.g., HARQ feedback for this downlink data) or SRS through time-division multiplexing, thus reducing the downlink HARQ feedback delay.
UL-dominant slot
In Case 4-4, the slot is mainly used for uplink data transmission, while a small number of symbols are used for downlink control signals (e.g., uplink scheduling grants on the PDCCH) through time-division multiplexing, thus shortening the uplink scheduling delay.
In the design of a self-contained slot, both the base station and the UE need to switch between uplink and downlink transmission within a single slot. To ensure correct operation after the switch, a guard period is reserved during which no signal is transmitted or received. Currently, Huawei only supports DL-dominant slots.
Mini-slot
In order to further reduce the air interface latency, the 3GPP 38.912 protocol proposes the concept of a mini-slot, i.e., Case 4-5. Unlike the slots in the other cases, the time-domain length of a mini-slot is less than 14 symbols. A mini-slot provides a finer division of the time domain than basic slot scheduling, so the scheduling latency is also shorter. Mini-slot scheduling is often called non-slot-based scheduling, as shown in the figure.
A mini-slot is generally used in the following scenarios:
- Low-latency scenarios: the most typical is the URLLC service scenario.
- Unlicensed band deployment scenarios: after the spectrum has been sensed as legally available (listen-before-talk), data transmission can be started faster.
- Millimeter-wave band deployment scenarios: multiple users can be time-division multiplexed within one basic slot.
The 3GPP 38.214 protocol defines in detail the number of symbols under non-slot based scheduling, which
can be any number of symbols from 2 to 13. Currently, Huawei supports the non-Slot test feature, and the
number of symbols supported by this feature in the upstream and downstream directions is as follows.

Downlink direction: PDSCH supports 2/4/7 symbols.

Upstream direction: PUSCH supports 2/4/8 symbols.
However, the Non-Slot feature cannot be equated with mini-slot, because 3GPP 38.912 protocol proposes
that mini-slot can be applied in FR2 frequency range and can support 1 symbol length, while the Non-Slot
feature only supports FR1 frequency range.
Time slot configuration
For LTE systems using time-division duplexing (TDD LTE), the 3GPP 36.211 protocol defines 7 uplink-downlink subframe allocation configurations at frame granularity, with a transition period of 5 ms or 10 ms. In 5G NR, the number of slots per frame is no longer fixed and is strongly correlated with the SCS, so when time-division duplexing is used, NR needs to define a configuration scheme for the number of uplink and downlink slots within one transition period.
Note:
A transition period is the interval within which one complete downlink-to-uplink and uplink-to-downlink switch takes place, and a guard period must be configured for each switch between downlink and uplink transmission.
The 3GPP 38.213 protocol defines a four-layer time slot configuration scheme.

Layer 1: Cell-specific SIB1 message semi-static configuration with SCS-dependent transition periods
including periods {0.5, 0.625, 1, 1.25, 2, 2.5, 5, 10}ms.

Layer 2: UE-specific RRC signaling semi-static configuration, with a conversion period related to SCS,
including periods {0.5, 0.625, 1, 1.25, 2, 2.5, 5, 10}ms.

Layer 3: UE-group SFI (slot format indicator) dynamic configuration with SCS-dependent transition
period including periods of {1, 2, 4, 5, 8, 10, 20} slots.

Layer 4: UE-specific DCI dynamically configured, where the conversion period is fixed at 1 slot.
Note:
In the Layer 3-time slot configuration scheme, SFI is implemented via DCI format 2_0. In the Layer 4-time slot
configuration scheme, the current 3GPP protocol defines the DCI formats that support this configuration
scheme, including format 0_0, format 0_1, format 1_0, format 1_1, and format 2_3.
When the first layer configuration scheme is adopted, it can realize a unified static time slot configuration for
the whole network, i.e. a framework time slot configuration structure. The second, third and fourth layer
configurations can be flexibly adjusted for different UEs with finer granularity according to their different
service requirements in different scenarios. The comparison of the characteristics of the above four-layer
configuration scheme is shown in the table.
Configuration option: Layer 1: Cell-specific + SIB1
Features: cell-level + static or semi-static resource configuration.
Resource configuration priority: highest; the parts identified as D or U in the Cell-specific configuration cannot be modified by the configurations of the other layers.

Configuration option: Layer 2: UE-specific + RRC
Features: UE-level + static or semi-static resource configuration.
Resource configuration priority: higher; it can further configure the parts identified as F in the Cell-specific configuration, but the parts identified as D or U at this layer cannot be modified by the layer 3 and layer 4 configurations.

Configuration option: Layer 3: UE-group + SFI
Features: UE-level or UE-group level + periodic dynamic configuration.
Resource configuration priority: lower; it can further configure the parts identified as F in the layer 1 or layer 2 configuration, but the parts identified as D or U at this layer cannot be modified by the layer 4 configuration.

Configuration option: Layer 4: UE-specific + DCI
Features: UE-level + slot-level dynamic configuration.
Resource configuration priority: lowest; it can only further configure the parts of the layer 1, layer 2, or layer 3 configurations identified as F.
5G NR supports multi-layer nested configurations as well as independent configurations for each layer, as
shown in the figure. When the multi-layer nested configuration is used, the next layer can only do further
configuration on the part of the previous layer that is configured with the F attribute. Currently, Huawei only
supports the first layer configuration.
Multi-layer nested configuration schematic
Standalone configuration schematic
5G technology-phased array antenna technology
What is an antenna array?
An antenna array is a group of antennas arranged according to some geometric rule; each single antenna in the array is called an "array element".
Line Array
A set of antennas arranged in a straight line is called a linear array; the most common is the uniform linear array. "Uniform" means that the spacing between the array elements is equal, i.e., the antennas are uniformly arranged. Of course, there are also non-uniform linear arrays.
The radiation pattern of a linear array is one-dimensional and is generally applied in the horizontal plane (azimuth).
Planar Array
The most common planar array is the uniform planar array; other layouts include L-shaped, rectangular, polygonal, and circular arrays.
The radiation pattern of a planar array is two-dimensional, covering both azimuth and elevation angles.
Stereo Array
A stereo array is a 3-dimensional (3D) array. The simplest type of 3D array can be formed by stacking multiple
uniform circular arrays. The application of 3D arrays in 5G large-scale MIMO has received much attention in
recent years.
Phased Array
Phased arrays were the first antenna arrays to gain practical application, mainly in radar, i.e., phased-array radar. The main purpose of a phased array is to scan the array beam across space, which is called electronic scanning.
A phased array forms the spatial beam direction of the antenna array by controlling the phase (or time delay) of each antenna element channel. By continuously changing the phase weights of the elements, the beam can be scanned across the spatial domain without rotating the antenna.
Phase control in phased arrays is usually implemented in the RF domain, i.e., an analog implementation. In addition to phase control, gain control is sometimes performed as well (e.g., using a controlled attenuator). Compared with digital beamforming, a phased array requires only one RF chain with up/down-conversion and ADC, so the hardware cost is low, especially when the number of array elements is large.
Advances in phased-array antenna technology later led to the beamforming techniques used in 5G.
5G Frequency Domain Resources-1
Summary of information
In this article, you can learn:

What are the duplex modes in the 5G NR bands and why are all the duplex modes in the FR2 bands
TDD?

What are the basic frequency domain resources in 5G NR and how do they differ conceptually from
LTE?

What is the role of Synchronization Raster and why is the concept of Synchronization Raster
introduced in 5G NR?

What is the BWP and why is it introduced in 5G NR and how does the UE get its BWP information
from the network side? What does Huawei currently support?
Overview
Time and frequency resources for the 5G system's air interface are similar to those of LTE, including the
time domain and frequency domain. In the article "5G Time Domain Resources", we introduced the
concepts related to the time domain in 5G NR. In this paper, we will follow up on "5G Time Domain
Resources" and introduce the concepts related to frequency domain resources in 5G systems from the
following aspects.

Division of 5G spectrum and frequency bands

Concepts related to frequency domain resources

Concepts related to Bandwidth

Channel Raster and Synchronization Raster

Introduction to BWP
5G spectrum and frequency band division
In the field of wireless communication, spectrum is always a scarce resource. Generally speaking, the low-frequency spectrum is a high-quality resource, but because of its small bandwidth it was quickly occupied by the communication systems of previous generations. When 3GPP developed the 5G standard and planned its spectrum, in order to meet the design goal of large bandwidth, in addition to planning to use low-frequency spectrum not yet allocated by the ITU, spectrum planning was also extended to high frequencies. The entire 5G NR spectrum is divided into two parts, FR1 and FR2.

FR1: the main band of 5G, also known as the Sub-6G band, is generally below 6 GHz and is often called the 5G low band. With the continuous evolution of the protocol, the latest 3GPP 38.104 V16.2.0 has extended the maximum frequency of FR1 to 7125 MHz, which is above 6 GHz, but it is still customary to call FR1 the Sub-6G band. Within FR1, frequencies below 3 GHz are also referred to as the Sub-3G band, and the rest is referred to as C-Band. Since most of the spectrum below 3 GHz has already been occupied by previous generations of communication systems, only a relatively small number of countries and regions have been able to plan contiguous, wide-bandwidth spectrum for 5G NR in this range; the C-Band, on the other hand, is not occupied by existing communication systems in most countries and can provide wider contiguous spectrum for 5G NR. Therefore, the preliminary conclusion of the ITU WRC-15 meeting was that C-Band would be the potential globally unified 5G band.

FR2: The extended frequency band for 5G, also known as the millimeter-wave band, is also
commonly referred to as 5G HF. The latest 3GPP 38.104 V16.2.0 defines the frequency range for
FR2 between 24 GHz and 52 GHz. However, as the protocol continues to evolve, higher frequency
spectrum resources planned for 5G NR have been discussed in relevant 3GPP working groups, and
in the concluded WRC19 meeting, it was proposed that the 5G millimeter wave band will be
extended to 71 GHz.
The spectrum division of 5G NR is shown in the figure.
According to some discussions in the industry, the spectrum resource application strategy for 5G can be
divided into three categories from the capacity and coverage perspective.

For dense urban areas with high capacity claims, such as airports and CBDs, the use of the
millimeter-wave band, FR2 spectrum deployment can be considered.

For general urban areas with both coverage and capacity requirements, C-Band, i.e. FR1 3GHz or
higher, can be considered.

For suburban/rural areas with higher coverage requirements, consider using the original low-frequency bands, generally the FR1 Sub-3G spectrum.
FR1 and FR2 are only two broad ranges of the 5G NR spectrum. Within these two ranges, other communication systems already occupy parts of the spectrum, and the actual situation differs from country to country. Therefore, 3GPP has divided the spectrum resources of FR1 and FR2 into several bands, taking into full consideration the current spectrum usage and future planning of each country, so that each country can choose the appropriate bands according to its available spectrum when deploying its own 5G network. Before introducing the band division, it is also necessary to introduce an important concept in communication systems: the duplex mode.
Duplex mode
The concept of the duplex is relative to simplex. Simplex refers to the ability to transmit data in one
direction only. One of the two communicating parties is fixed as the sender and the other is fixed as the
receiver. Duplex means that either side of the communication can be both the sender and the receiver.
When data cannot be received while sending data or when data cannot be sent while receiving data, it is
called half-duplex. And when data can be received while sending data, it is called a full duplex.
Starting from LTE, the air interface resources are time-frequency resources.
- When the uplink and downlink of the base station and the user use different frequency resources, this is called frequency-division duplexing, i.e., FDD.
- When the uplink and downlink of the base station and the user use the same frequency resources, distinguished only by time-domain resources, this is called time-division duplexing, i.e., TDD.
The duplex modes of 5G NR follow the design concept of LTE. Unlike LTE, however, which emphasizes the duplex mode at the system level (e.g., TDD-LTE and FDD-LTE), 5G rarely mentions the duplex mode at the system level, thanks to its flexible slot type and format design.
Note:
Whether in LTE or 5G NR, in a TDD system the resources received and sent by the same terminal are distinguished in the time domain; however, at the granularity of a radio frame, and from the terminal's practical point of view, the terminal can send and receive data "simultaneously", so a TDD system is usually considered a full-duplex system.
5G NR Frequency Banding
Like LTE, 5G NR divides the spectrum resources into several bands, which are numbered in the form of
"n+number". The duplex mode of each band must be clear from the spectrum division point of view. 5G
NR defines the duplex modes of the bands as follows.

FDD: frequency division duplexing.

TDD: Time-division duplexing.

SDL: auxiliary downlink, i.e. the band only supports transmission in the direction of the gNodeB to
the UE.

SUL: auxiliary uplink, i.e. the band only supports transmission from the UE to the gNodeB direction.
The current 3GPP 38.101-1 and 38.101-2 protocols divide the frequency bands of FR1 and FR2 as shown in the tables below: 45 bands are defined in FR1 and 4 bands in FR2, and all 4 bands defined in FR2 use TDD; there is no FDD mode in FR2.

The band range of FR2 is at high frequency, so its coverage capability is poor. Therefore, 5G NR uses beamforming with a large number of antenna elements to improve the coverage capability of the system. Beam assignment requires the gNodeB to obtain the user's CSI; in TDD, since the uplink and downlink are in the same frequency band, the downlink CSI can be obtained directly from uplink measurements at the gNodeB by exploiting uplink-downlink reciprocity. Compared with FDD, TDD therefore avoids the need for the gNodeB to allocate a large amount of downlink reference signal resources to the UE just to obtain downlink channel information, and the resulting waste of resources.
It should be noted that in 5G HF, Huawei currently relies on the user's CSI feedback to obtain downlink
channel state information for beam management.

Among the four frequency bands defined in FR2, even the smallest has a bandwidth close to 1 GHz. If such a band were also to be used in FDD mode, a band of the same size would need to be found for the uplink, which complicates planning. At the same time, the bands defined in FR2 support carriers of up to 400 MHz, which are generally used for high-data-volume scenarios, especially in hotspot areas; such traffic is usually asymmetric (downlink demand is generally higher than uplink), so the uplink bandwidth demand is not as strong as the downlink. TDD is itself better suited to asymmetric traffic because of its dynamic ratio of uplink and downlink resources, and compared with FDD it reduces the waste of uplink bandwidth caused by insufficient uplink data volume.
Banding of FR1

NR operating band | Uplink (UL) operating band (BS receive / UE transmit), FUL_low – FUL_high | Downlink (DL) operating band (BS transmit / UE receive), FDL_low – FDL_high | Duplex mode
n1 | 1920 MHz – 1980 MHz | 2110 MHz – 2170 MHz | FDD
n2 | 1850 MHz – 1910 MHz | 1930 MHz – 1990 MHz | FDD
n3 | 1710 MHz – 1785 MHz | 1805 MHz – 1880 MHz | FDD
n5 | 824 MHz – 849 MHz | 869 MHz – 894 MHz | FDD
n7 | 2500 MHz – 2570 MHz | 2620 MHz – 2690 MHz | FDD
n8 | 880 MHz – 915 MHz | 925 MHz – 960 MHz | FDD
n12 | 699 MHz – 716 MHz | 729 MHz – 746 MHz | FDD
n14 | 788 MHz – 798 MHz | 758 MHz – 768 MHz | FDD
n18 | 815 MHz – 830 MHz | 860 MHz – 875 MHz | FDD
n20 | 832 MHz – 862 MHz | 791 MHz – 821 MHz | FDD
n25 | 1850 MHz – 1915 MHz | 1930 MHz – 1995 MHz | FDD
n28 | 703 MHz – 748 MHz | 758 MHz – 803 MHz | FDD
n29 | N/A | 717 MHz – 728 MHz | SDL
n30 | 2305 MHz – 2315 MHz | 2350 MHz – 2360 MHz | FDD
n34 | 2010 MHz – 2025 MHz | 2010 MHz – 2025 MHz | TDD
n38 | 2570 MHz – 2620 MHz | 2570 MHz – 2620 MHz | TDD
n39 | 1880 MHz – 1920 MHz | 1880 MHz – 1920 MHz | TDD
n40 | 2300 MHz – 2400 MHz | 2300 MHz – 2400 MHz | TDD
n41 | 2496 MHz – 2690 MHz | 2496 MHz – 2690 MHz | TDD
n48 | 3550 MHz – 3700 MHz | 3550 MHz – 3700 MHz | TDD
n50 | 1432 MHz – 1517 MHz | 1432 MHz – 1517 MHz | TDD
n51 | 1427 MHz – 1432 MHz | 1427 MHz – 1432 MHz | TDD
n65 | 1920 MHz – 2010 MHz | 2110 MHz – 2200 MHz | FDD
n66 | 1710 MHz – 1780 MHz | 2110 MHz – 2200 MHz | FDD
n70 | 1695 MHz – 1710 MHz | 1995 MHz – 2020 MHz | FDD
n71 | 663 MHz – 698 MHz | 617 MHz – 652 MHz | FDD
n74 | 1427 MHz – 1470 MHz | 1475 MHz – 1518 MHz | FDD
n75 | N/A | 1432 MHz – 1517 MHz | SDL
n76 | N/A | 1427 MHz – 1432 MHz | SDL
n77 | 3300 MHz – 4200 MHz | 3300 MHz – 4200 MHz | TDD
n78 | 3300 MHz – 3800 MHz | 3300 MHz – 3800 MHz | TDD
n79 | 4400 MHz – 5000 MHz | 4400 MHz – 5000 MHz | TDD
n80 | 1710 MHz – 1785 MHz | N/A | SUL
n81 | 880 MHz – 915 MHz | N/A | SUL
n82 | 832 MHz – 862 MHz | N/A | SUL
n83 | 703 MHz – 748 MHz | N/A | SUL
n84 | 1920 MHz – 1980 MHz | N/A | SUL
n86 | 1710 MHz – 1780 MHz | N/A | SUL
n89 | 824 MHz – 849 MHz | N/A | SUL
n90 | 2496 MHz – 2690 MHz | 2496 MHz – 2690 MHz | TDD
n91 | 832 MHz – 862 MHz | 1427 MHz – 1432 MHz | FDD
n92 | 832 MHz – 862 MHz | 1432 MHz – 1517 MHz | FDD
n93 | 880 MHz – 915 MHz | 1427 MHz – 1432 MHz | FDD
n94 | 880 MHz – 915 MHz | 1432 MHz – 1517 MHz | FDD
n95 | 2010 MHz – 2025 MHz | N/A | SUL

Banding of FR2

NR operating band | Uplink (UL) operating band (BS receive / UE transmit), FUL_low – FUL_high | Downlink (DL) operating band (BS transmit / UE receive), FDL_low – FDL_high | Duplex mode
n257 | 26500 MHz – 29500 MHz | 26500 MHz – 29500 MHz | TDD
n258 | 24250 MHz – 27500 MHz | 24250 MHz – 27500 MHz | TDD
n260 | 37000 MHz – 40000 MHz | 37000 MHz – 40000 MHz | TDD
n261 | 27500 MHz – 28350 MHz | 27500 MHz – 28350 MHz | TDD
Concepts related to frequency domain resources
Some of the concepts of frequency domain resources in 5G NR are similar to LTE, but some of them are
newly introduced in 5G NR. We will introduce these concepts one by one.
RE/RB/RG
RE (Resource Element) is the smallest granular physical layer resource of 5G NR, which is 1 subcarrier in
the frequency domain and 1 OFDM symbol in the time domain, as in LTE.
RB (Resource Block) is the basic unit of channel resource allocation in the frequency domain of 5G NR. As in LTE, it contains 12 subcarriers in the frequency domain, but since the subcarrier spacing is variable in 5G NR, the actual RB bandwidth is also variable. A further note is needed:

- When LTE defines the RB, it explicitly spans 1 time slot in the time domain, i.e., 7 OFDM symbols for normal CP and 6 OFDM symbols for extended CP. The LTE slot length is fixed: 1 subframe contains two slots, so the slot length is 0.5 ms. Since the scheduling TTI of LTE is 1 ms, i.e., 2 slots in the time domain, the actual scheduling granularity of LTE is the RB pair, i.e., 2 RBs in the time domain. In practice we usually ignore the time-domain dimension in RB statistics and calculations; for example, the "number of RBs" used per TTI in a cell is actually the number of RB pairs.
- When 5G NR mentions the concept of an RB, the 3GPP protocol defines it only from the one-dimensional frequency-domain perspective and does not involve the time-domain dimension. From the scheduling perspective in 5G NR, the scheduling TTI is 1 time slot, but the absolute slot length in 5G NR is variable and related to the Numerology; this concept was introduced in "5G Time Domain Resources" and is not repeated here.
RG (Resource Grid) is a collection of time-frequency resources. In 5G NR it is defined as follows: for each Numerology on each carrier, an RG is the collection of all subcarriers in the frequency domain and all symbols within 1 subframe in the time domain, with the frequency-domain starting point aligned to an RB boundary. Since different Numerologies correspond to different SCSs, and an RB is fixed at 12 subcarriers, for the same transmission bandwidth the number of RBs contained in the RG differs between Numerologies. The RG is fixed at 1 subframe in the time domain. Uplink and downlink each define their own RGs separately.
The resource division of RE, RB, and RG is shown in the figure.
CRB/PRB/RBG
CRB (Common Resource Block) is a generic term for all the RBs in 5G NR, numbered from 0. The frequency
point of subcarrier 0 in CRB0 is also Point A. Point A will be introduced in the following section.
PRB (Physical Resource Block) refers to the RBs contained in the BWP of a UE in 5G NR, which are also
numbered from 0 and belong to the basic unit of data channel scheduling.
RBG (Resource Block Group) is a combination of several PRBs in a BWP, also numbered from 0, and is the
basic unit of data channel scheduling. An RBG can contain {2, 4, 8, 16} PRBs, the exact number of which
depends on the number of RBs in the BWP and the configuration options, as shown in the figure.
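As an illustration of how the RBG size depends on the BWP size and the configuration, the sketch below follows the pattern of the nominal RBG size table in 3GPP 38.214 (Table 5.1.2.2.1-1); treat the exact breakpoints as an assumption to be checked against the release in use.

```python
# Hedged sketch: nominal RBG size P as a function of BWP size and configuration,
# following the pattern of 3GPP 38.214 Table 5.1.2.2.1-1.
def rbg_size(bwp_size_prbs: int, config: int = 1) -> int:
    """Return the nominal RBG size for a BWP of the given size in PRBs."""
    table = [
        (36, 2, 4),      # BWP size 1-36:    config 1 -> 2,  config 2 -> 4
        (72, 4, 8),      # BWP size 37-72:   config 1 -> 4,  config 2 -> 8
        (144, 8, 16),    # BWP size 73-144:  config 1 -> 8,  config 2 -> 16
        (275, 16, 16),   # BWP size 145-275: config 1 -> 16, config 2 -> 16
    ]
    for upper, p_cfg1, p_cfg2 in table:
        if bwp_size_prbs <= upper:
            return p_cfg1 if config == 1 else p_cfg2
    raise ValueError("BWP size out of range")

print(rbg_size(273, config=1))   # 16: e.g. a full 100 MHz / 30 kHz SCS BWP
print(rbg_size(52, config=2))    # 8
```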
REG/CCE
REG (Resource Element Group) is the basic unit of control channel resources. In the frequency domain, 1 REG spans 12 subcarriers, i.e., the width of one PRB, and in the time domain it spans 1 OFDM symbol.
CCE (Control Channel Element) is the basic unit of control channel resource scheduling; 1 CCE consists of 6 REGs.
The relationship between REG and CCE is shown in the figure.
A comparison of the concepts related to the frequency domain resources of 5G NR and LTE is shown in the
table.
Frequency domain concept | Involved in NR | Involved in LTE | Difference between NR and LTE | Note
RE | Yes | Yes | No difference | None
RB | Yes | Yes | LTE explicitly defines the RB as spanning 1 time slot (0.5 ms) in the time domain; 5G NR gives no explicit time-domain definition | None
RG | Yes | Yes | No difference | The time-domain length of the RG is one slot (0.5 ms) in LTE and one subframe (1 ms) in 5G NR
CRB | Yes | No | CRB is not defined in LTE | None
PRB | Yes | Yes | No difference | None
RBG | Yes | Yes | No difference | None
REG | Yes | Yes | No difference | In LTE one REG consists of 4 REs; in 5G NR one REG consists of 12 REs
CCE | Yes | Yes | No difference | In LTE one CCE consists of 9 REGs; in 5G NR one CCE consists of 6 REGs
5G Frequency Domain Resources-2
Bandwidth-related concepts
As already mentioned in the banding of 5G, the entire spectrum resource is divided into several bands. But
in fact, these bands are defined based on the current spectrum resource usage and future planning of each
country, so we will find that the spectrum ranges of some bands will even overlap. For the wireless
communication system, if the channel is considered as a highway, it is certainly hoped that the width of
the highway conforms to one or a limited number of standards. The width of this standard is what we
often refer to as the standard channel bandwidth or carrier bandwidth.
In 5G NR, as in LTE, not all of the spectrum within the standard channel bandwidth is available for data transmission. A portion of the spectrum at the two edges of the standard bandwidth is set aside as a guard band to avoid interference from outside the channel bandwidth during data transmission. The spectrum actually available for data transmission is called the transmission bandwidth. The spectrum utilization for a given standard bandwidth can therefore be defined as spectrum utilization = maximum transmission bandwidth / standard bandwidth. The relationship between channel bandwidth, transmission bandwidth, and guard bandwidth is shown in the figure.
In LTE, the maximum transmission bandwidth is a fixed percentage of the standard bandwidth, i.e., 90%, which means that a fixed 10% of the standard bandwidth is used as guard band. In contrast, 5G NR uses F-OFDM, and the percentage of the spectrum usable for data transmission varies with the SCS and the standard bandwidth size. The 3GPP 38.104 protocol defines the standard bandwidth types and their maximum transmission bandwidths, as shown in the tables below.
Channel bandwidth size and transmission bandwidth share supported by FR1 (each cell gives NRB / spectrum utilization)

Channel bandwidth | SCS 15 kHz | SCS 30 kHz | SCS 60 kHz
5 MHz | 25 / 90% | 11 / 79.2% | N/A
10 MHz | 52 / 93.6% | 24 / 86.4% | 11 / 79.2%
15 MHz | 79 / 94.8% | 38 / 91.2% | 18 / 86.4%
20 MHz | 106 / 95.4% | 51 / 91.8% | 24 / 86.4%
25 MHz | 133 / 95.8% | 65 / 93.6% | 31 / 89.3%
30 MHz | 160 / 96% | 78 / 93.6% | 38 / 91.2%
40 MHz | 216 / 97.2% | 106 / 95.4% | 51 / 91.8%
50 MHz | 270 / 97.2% | 133 / 95.8% | 65 / 93.6%
60 MHz | N/A | 162 / 97.2% | 79 / 94.8%
70 MHz | N/A | 189 / 97.2% | 93 / 95.7%
80 MHz | N/A | 217 / 97.7% | 107 / 96.3%
90 MHz | N/A | 245 / 98% | 121 / 96.8%
100 MHz | N/A | 273 / 98.3% | 135 / 97.2%

Channel bandwidth size and transmission bandwidth share supported by FR2 (each cell gives NRB / spectrum utilization)

Channel bandwidth | SCS 60 kHz | SCS 120 kHz
50 MHz | 66 / 95% | 32 / 92.2%
100 MHz | 132 / 95% | 66 / 95%
200 MHz | 264 / 95% | 132 / 95%
400 MHz | N/A | 264 / 95%
As can be seen from the table above.

Compared to LTE, 5G NR removes the standard definition of bandwidth below 5MHz.

The smaller the SCS, the higher the spectrum utilization will be.

The larger the channel bandwidth, the higher the spectrum utilization will be.

The standard bandwidths of 50MHz and 100MHz are supported by both FR1 and FR2.
In addition, readers familiar with 3GPP may find that when the actual protocol defines certain parameters
involving the number of RBs, the maximum value is usually 275 instead of the maximum number of RBs in
the table, which is 273. Therefore, we can simply understand that 275 RB is an ideal value for the design
goal, while the maximum RB of each SCS in the above table is a realistic value based on the actual situation.
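The spectrum utilization figures in the tables above can be reproduced directly from the definition spectrum utilization = (NRB × 12 × SCS) / channel bandwidth:

```python
# Hedged sketch: spectrum utilization = (N_RB * 12 * SCS) / channel bandwidth,
# reproducing a few cells of the FR1 table above.
def spectrum_utilization(n_rb: int, scs_khz: int, channel_bw_mhz: float) -> float:
    transmission_bw_mhz = n_rb * 12 * scs_khz / 1000
    return transmission_bw_mhz / channel_bw_mhz

print(f"{spectrum_utilization(270, 15, 50):.1%}")    # ~97.2%
print(f"{spectrum_utilization(273, 30, 100):.1%}")   # ~98.3%
print(f"{spectrum_utilization(135, 60, 100):.1%}")   # ~97.2%
```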
Channel Raster and Synchronization Raster
In 5G NR, as in LTE, the first signals received by the UE at power-on are the PSS and SSS of the cell, through which it achieves downlink time synchronization with the cell. In LTE, since the PSS and SSS are always located at the center of the carrier, an LTE terminal that finds the PSS/SSS therefore also knows the center frequency of the carrier. In fact, not every frequency point can be used as the carrier center frequency of a cell: the minimum frequency spacing between allowed carrier center frequencies is called the Channel Raster, and in LTE this minimum spacing is 100 kHz.
Channel Raster diagram
The concept of the Channel Raster in 5G NR is the same as in LTE, but in some frequency bands the minimum frequency spacing is smaller than 100 kHz. If the UE had to search every candidate position, this would result in a longer search time and higher performance requirements for the terminal. Therefore, in order to find the cell PSS and SSS faster, 5G NR adopts a different approach from LTE: the SSB (PSS + SSS + PBCH) is no longer required to be at the center of the carrier, and in each band there is only a limited set of possible SSB locations, called the Synchronization Raster; the UE performs the SSB search only at these sparse, specific locations. By analogy with the Channel Raster, the Synchronization Raster can be thought of as the minimum frequency spacing between possible SSB center frequencies.
The position of PSS and SSS in LTE and 5G NR in the carrier is shown in Fig.
The Synchronization Raster divides a carrier bandwidth into a finite number of locations where SSBs can be deployed, and 3GPP numbers all possible SSB center frequency points across the NR spectrum; this number is called the GSCN (Global Synchronization Channel Number). The conversion between the SSB center frequency point and the GSCN is given in the 3GPP 38.101-1 and 38.101-2 protocols, as shown in the following table.
Frequency range | SS block frequency position SSREF | GSCN | Range of GSCN
0 – 3000 MHz | N * 1200 kHz + M * 50 kHz, N = 1:2499, M ∈ {1, 3, 5} (Note 1) | 3N + (M - 3)/2 | 2 – 7498
3000 – 24250 MHz | 3000 MHz + N * 1.44 MHz, N = 0:14756 | 7499 + N | 7499 – 22255
24250 – 100000 MHz | 24250.08 MHz + N * 17.28 MHz, N = 0:4383 | 22256 + N | 22256 – 26639

NOTE 1: The default value for operating bands which only support SCS-spaced channel raster(s) is M = 3.
As can be seen from the table, the higher the frequency range, the larger the Synchronization Raster: in the 24250 – 100000 MHz range, the minimum spacing between SSB center frequency points is 17.28 MHz. This is because, in the high-frequency range, 5G NR can deploy cells with larger channel bandwidths, so a larger Synchronization Raster can be used while still ensuring that the UE can search for SSBs quickly.
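The GSCN mapping in the table above can be written down directly; the sketch below converts the (range, N, M) parameters into the SSB reference frequency and GSCN.

```python
# Hedged sketch: SSB reference frequency (SS_REF) and GSCN from the
# three ranges of the table above.
def gscn_from_params(freq_range: str, N: int, M: int = 3) -> tuple[float, int]:
    """Return (SS_REF in MHz, GSCN) for the given range and parameters."""
    if freq_range == "0-3000":          # N = 1..2499, M in {1, 3, 5}
        ss_ref = (N * 1200 + M * 50) / 1000
        gscn = 3 * N + (M - 3) // 2
    elif freq_range == "3000-24250":    # N = 0..14756
        ss_ref = 3000 + N * 1.44
        gscn = 7499 + N
    else:                               # "24250-100000", N = 0..4383
        ss_ref = 24250.08 + N * 17.28
        gscn = 22256 + N
    return ss_ref, gscn

print(gscn_from_params("0-3000", N=2000))       # (2400.15 MHz, 6000)
print(gscn_from_params("3000-24250", N=400))    # (3576.0 MHz, 7899)
```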
The Synchronization Raster leads to a problem: its value is not necessarily an integer multiple of the CRB bandwidth, so the subcarrier at the starting position of the blindly searched SSB is not necessarily aligned with a CRB boundary in the data domain, as shown in the figure. The UE therefore also needs to know this offset at initial access, and it is broadcast to the UE by the NR cell in the MIB. 3GPP uses KSSB to represent this offset (KSSB also has other roles, described in the 3GPP 38.213 protocol). Through KSSB, the UE can obtain the frequency-domain location of subcarrier 0 of the lowest CRB that overlaps with the SSB. This frequency point, also called the reference point, is one of the key pieces of information the UE uses to determine Point A signaled by the network side and, based on Point A, to further determine the BWP location.
The value range of KSSB defined in 3GPP 38.211 also differs between frequency ranges, so the corresponding reference point is calculated as follows:
- For FR1, KSSB is expressed in units of 15 kHz SCS and takes values in the range 0 to 23:
  Frequency of the reference point = frequency of the lowest subcarrier of the SSB - KSSB * 15 kHz
- For FR2, KSSB is expressed in units of the data-domain SCS and takes values in the range 0 to 11:
  Frequency of the reference point = frequency of the lowest subcarrier of the SSB - KSSB * subCarrierSpacingCommon
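A small sketch of the reference-point calculation, with the SSB frequency and KSSB value chosen purely for illustration:

```python
# Hedged sketch: deriving the reference point (subcarrier 0 of the lowest CRB
# overlapping the SSB) from k_SSB, per the two formulas above. Frequencies in kHz.
def reference_point_khz(ssb_lowest_subcarrier_khz: float, k_ssb: int,
                        fr: str = "FR1", common_scs_khz: int = 120) -> float:
    if fr == "FR1":                     # k_SSB in units of 15 kHz, range 0..23
        return ssb_lowest_subcarrier_khz - k_ssb * 15
    else:                               # FR2: k_SSB in units of the common SCS, range 0..11
        return ssb_lowest_subcarrier_khz - k_ssb * common_scs_khz

# Assumed example: SSB lowest subcarrier at 3 600 000 kHz (3.6 GHz), k_SSB = 10
print(reference_point_khz(3_600_000, 10))       # 3 599 850 kHz
```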
Note:
In the NSA networking scenario, the UE can also obtain SSB information directly through the NR-ARFCN
Introduction to BWP
In LTE, the transmission bandwidth on the UE side and the transmission bandwidth configured on the eNodeB side must be the same. In 5G NR, the 3GPP protocol specifies that the system can support a much larger transmission bandwidth. However, due to the diversity of 5G services, from the UE's perspective some services may not require a large transmission bandwidth, while supporting a large transmission bandwidth also means higher terminal cost; therefore, the 3GPP protocol introduces the concept of the BWP (BandWidth Part) for 5G NR.
The BWP is a continuous spectrum resource allocated to the UE on the network side, and the UE transmits
data on the BWP, which can be smaller than the maximum transmission bandwidth on the network side,
thus achieving flexible transmission bandwidth allocation on the network side and the UE side. It should
be noted that BWP is a UE-level concept, i.e., different UEs can configure different BWPs. UEs do not need
to know the transmission bandwidth of gNodeB, but only need to know the BWP information configured
to them by gNodeB.
Application scenarios of BWP
The application scenarios of BWP are as follows.

Scenario #1: It is applied to small bandwidth capable UEs to access large bandwidth networks,
which can reduce the cost of UEs.

Scenario #2: UE switches between large and small BWPs to achieve power saving effect.

Scenario #3: A BWP can only correspond to one Numerology, but for different BWPs, different
Numerologies can be configured to carry different services
Classification of BWP
Depending on the UE's access state, the BWP of the UE can be classified as Initial BWP or Dedicated BWP:

Initial BWP is mainly used for data transmission when the UE is in the initial access state.

Dedicated BWP is mainly used for data transmission after the UE is in the RRC connection state.
In terms of transmission direction, the BWP of the UE (including Initial BWP and Dedicated BWP) can be
further divided into DL BWP and UL BWP, which are used for data transmission in the downlink and uplink
directions respectively.
The 3GPP protocol specifies that a UE can be configured with up to four different Dedicated BWPs, but only one of them can be activated at a time as the Active BWP. A BWP inactivity timer can be configured; when this timer expires, the gNodeB switches the UE's BWP back to the Default BWP (if one is defined in the configuration) or to the Initial BWP, as shown in the figure below. Both the Active BWP and the Default BWP are Dedicated BWPs.
Currently, Huawei products support at most 2 different Dedicated BWPs. One is called the Full Bandwidth BWP, i.e., its bandwidth is the transmission bandwidth of the entire carrier. The other is called the Power Saving BWP, which is configured to the UE only when the terminal energy-saving feature is in effect, and is only supported in the TDD bands of FR1.