Increased Signal Amplitude due to Multipath - Question
Last Post: February 13, 2008
-
Hi,
I'm studying for my CWNA right now and have a question about something I read in the CWNA v3.0 study guide. On pg. 399 it says that
"It is important to understand that a received RF signal can never be as strong as the signal that was transmitted, due to the effect of Free Space Path Loss"
This is in regards to multipath and how sometimes multipath can cause an increase in signal amplitude. My question is as follows:
I can see how the received signal at a station far away from an AP can never be greater than the transmitted signal. But what if I were to strategically place metal around an AP and put a station 1 foot away from it? Wouldn't the Free Space Path Loss then be negligible, so that the increase in amplitude due to multipath could push the received signal above the transmitted signal amplitude? -
There are many things to consider here: reflection, phase, upfade, downfade, multipath, and intended use. So here's a quick look.
The increase in amplitude, or upfade, can happen if reflected signals arrive at the same time as the main signal and in phase with it. Multipath becomes a problem when the reflected signals arrive out of phase with the main signal. If a reflected signal arrives at the same time as the main signal and is 180 degrees out of phase with it, the effect is a nulling of the signal. Outdoor wireless mesh routers often depend on the upfade caused by multipath to improve signal quality. The combined power of the reflected signals may be stronger than the main signal alone. Multipath can have good or bad effects on signal quality depending on when the reflected signal(s) arrive relative to the main signal and whether the reflections are in or out of phase with it.
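A minimal sketch of that phase relationship, using two sinusoids with assumed amplitudes of 1.0 and 0.8 (my numbers, not from the post): in phase the copies add (upfade), while 180 degrees out of phase they nearly cancel.

```python
# Sketch: combining a main signal with one reflected copy at different phase offsets.
# Amplitudes, frequency, and the 0/90/180 degree offsets are illustrative assumptions.
import numpy as np

t = np.linspace(0, 1e-9, 1000)            # 1 ns window
f = 2.412e9                               # assumed 2.4 GHz carrier (channel 1)
main = 1.0 * np.sin(2 * np.pi * f * t)    # main signal, amplitude 1.0

for phase_deg in (0, 90, 180):
    reflected = 0.8 * np.sin(2 * np.pi * f * t + np.radians(phase_deg))
    combined = main + reflected
    print(f"phase offset {phase_deg:3d} deg -> peak amplitude {combined.max():.2f}")

# Expected pattern: 0 deg (in phase) peaks near 1.8 (upfade), 180 deg peaks near 0.2
# (close to a null when the amplitudes are similar), and 90 deg falls in between.
```
-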
Hi, thanks for responding to my message. I understand that there are many different things that need to be considered. You mentioned
The combined power of the reflected signals may be stronger than the main signal alone.
I take it you're referring to how the combined signal at some point b far away from the AP can be stronger than it would have been at that same point b if there were no reflections. That is comparing the signal strength at a point to what it would be without upfade, etc.
However, I'm trying to figure out whether, at some point b very close to the AP, upfade could for example cause the signal strength (in mW) to be greater than the power radiated by the AP's antenna. -
The loss 1 wavelength from the antenna is 22 dB.
From there, each time you double the distance you lose another 6 dB.
You can calculate the wavelength from the following equation:
Speed of Light = frequency x wavelength
Speed of Light = 3 x 10^8 meters per second (300,000 km per second)
With the frequency in Hz, the answer is in meters.
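To make that rule of thumb concrete, here is a small sketch with an assumed 2.4 GHz frequency (my numbers): it computes the wavelength and shows that the standard free space path loss formula gives about 22 dB at one wavelength and roughly 6 dB more per doubling of distance.

```python
# Sketch: wavelength and free space path loss for an assumed 2.4 GHz signal.
import math

c = 3.0e8            # speed of light in m/s (300,000 km/s)
f = 2.4e9            # assumed frequency in Hz
wavelength = c / f   # speed of light = frequency x wavelength, so wavelength = c / f
print(f"wavelength: {wavelength * 100:.1f} cm")    # about 12.5 cm

def fspl_db(distance_m):
    # Standard free space path loss in dB: 20 * log10(4 * pi * d / wavelength)
    return 20 * math.log10(4 * math.pi * distance_m / wavelength)

# About 22 dB at one wavelength, then roughly +6 dB per doubling of distance.
for d in (wavelength, 2 * wavelength, 4 * wavelength):
    print(f"distance {d:.3f} m -> FSPL {fspl_db(d):.1f} dB")
```
-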
The strongest power measurement will be the EIRP. The receiver will get the main signal at a reduced power due to free space path loss and any interference. In-phase reflections increase the received signal strength, but the result will still be a lower power than the original transmission measured at the transmitter.
-
Bryan,
I think the point here is that EIRP cannot be measured, only calculated. Take the transmit power, subtract the cable loss, and add the antenna gain to get the EIRP. The first measurement can be taken 1 wavelength from the antenna; it should be the EIRP minus 22 dB (inverse square law).
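A quick sketch of that dB bookkeeping, using made-up transmit power, cable loss, and antenna gain values purely to illustrate the calculation:

```python
# Sketch: EIRP is calculated, not measured. All values below are illustrative assumptions.
tx_power_dbm = 15.0      # transmitter output, dBm
cable_loss_db = 3.0      # cable/connector loss, dB
antenna_gain_dbi = 6.0   # antenna gain, dBi

eirp_dbm = tx_power_dbm - cable_loss_db + antenna_gain_dbi
print(f"EIRP: {eirp_dbm:.1f} dBm")                                  # 18.0 dBm

# First practical measurement point, one wavelength from the antenna: roughly EIRP - 22 dB.
print(f"expected at 1 wavelength: {eirp_dbm - 22:.1f} dBm")         # -4.0 dBm
```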
180-degree phase cancellation is rare, as is gain from receiving the signal over two separate multipaths perfectly in phase.
MIMO uses DSP to cancel out multipath and combines the energy per symbol from multiple receive antennas (via maximum ratio combining) to produce gain, improving the total energy per symbol and therefore the possible throughput.
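A rough sketch of the maximum ratio combining idea mentioned above, with simulated channel gains, noise, and BPSK symbols (assumptions for illustration, not taken from any real MIMO implementation): each branch is weighted by the conjugate of its channel gain so the branches add in phase, and the combined decision beats the best single antenna.

```python
# Sketch: maximum ratio combining (MRC) across three receive antennas.
# Channel gains, noise level, and symbols are simulated assumptions.
import numpy as np

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=1000)             # BPSK symbols
h = np.array([0.9 * np.exp(1j * 0.3),                    # per-antenna complex channel gains
              0.5 * np.exp(-1j * 1.1),
              0.7 * np.exp(1j * 2.0)])
noise = 0.7 * (rng.standard_normal((3, 1000)) + 1j * rng.standard_normal((3, 1000)))

received = h[:, None] * symbols + noise                  # what each antenna sees

# MRC: weight each branch by the conjugate of its channel gain, then sum.
# This aligns the branch phases and favors the stronger branches, which raises
# the combined energy per symbol relative to any single branch.
combined = np.sum(np.conj(h)[:, None] * received, axis=0)

best_single = np.mean(np.real(np.conj(h[0]) * received[0]) * symbols > 0)
mrc = np.mean(np.real(combined) * symbols > 0)
print(f"correct decisions, best single antenna: {best_single:.3f}")
print(f"correct decisions, MRC combined:        {mrc:.3f}")
```
-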
Thanks for all your replies. So I gather that it's not probable that the received power is higher than the transmitted power, but does anyone think it's possible? I don't know why this is bugging me, but I'm trying to find a mathematical way of showing that it's impossible, or something like that... or is it only improbable, not impossible, that multipath causes perfectly in-phase signals to increase the amplitude beyond what was transmitted? :-?
-
nb123,
I have always thought of it as "the sum of the parts can not be greater than the whole".
Even if a reflected wave arrived exactly in phase with the original wave, the combined power cannot be greater than that of the original wave.
Think of putting hot water in a sink. If you put a pan of water on the stove, bring it to a boil, and dump it into the sink, you now have the maximum water temperature (this represents the original power of the RF wave at the transmitter).
If you add hot tap water (reflected out of phase multi-path) it will mix with the hotter water already in the sink and you will reduce the overall temperature of the water in the sink.
Now if you wanted really hot water, you would drain the sink and refill it with another pot of boiling water. And if you wanted even hotter water, you could boil yet another pot and pour it in too (a reflected wave perfectly in phase). The water does not get hotter; it is exactly the same temperature as before we added the second pot of boiling water. We cannot raise the temperature of the water by adding more water of the same temperature as what is already in the sink.
As long as the temperature of the water in the sink is lower than the temperature of boiling water, we can increase the temperature of the sink water by adding boiling water to it, but if the water is already boiling hot we can't get it any hotter no matter how much boiling water we add.
So as long as an RF wave is at a lower power than when it left the transmitting antenna, the power can be raised by in-phase multipath signals adding to its power, but it can never be raised above its original power level no matter how many in-phase signals are added to it.
Someone a whole lot smarter than me can prove it with the math, but to me it just makes sense. -
Hi nb123,
When we talk about signal, we talk about amplitude. As a signal propagates, its amplitude decreases due to free space loss (the major factor in signal loss), reflection, interference, etc., as said before.
Let's assume the signal has an amplitude of, say, 2 units when it comes out of the antenna and eventually fades to 1 unit. When another wave, due to multipath and in phase with the direct signal, arrives at around 1 unit, the two add up again to give an amplitude of about 2 units.
Which means the combined signal would not gain beyond the transmitted level; at best it would more or less match the signal that left the transmitter.
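A toy restatement of that arithmetic, using the same assumed numbers (2 units transmitted, each path delivering about 1 unit), just to show the bookkeeping: the in-phase copies bring the amplitude back toward the transmitted level rather than past it.

```python
# Toy sketch of the post's arithmetic: idealized, assumed numbers, not measurements.
transmitted_amplitude = 2.0                 # units at the antenna (assumed)

direct = transmitted_amplitude * 0.5        # direct path has faded to 1 unit
reflected = transmitted_amplitude * 0.5     # in-phase reflected copy, also about 1 unit

combined_in_phase = direct + reflected      # 2 units: back at the transmitted level, not above it
print(f"direct: {direct}, reflected: {reflected}, combined: {combined_in_phase}")
```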
thanks
-