Apologies for the lengthy story; this one is for people professionally acquainted with cellular sites...

I recently went camping with the family, and of course, a few people really needed their internet. There is no signal in the hole this campground sits in. Anyhow, I knew the signal was really good up the hill 100 yards from the campsite, so I took my fishing pole and a hookless spoon, cast a line over a high tree branch, and taped the phone to the lure to raise it 30 feet in the air. Suddenly, "tree wifi" was up and running well. This started a whole new project...

I have known about passive repeaters (yes, I know, not repeaters at all) for years. The thought was to take my old aluminum 10m 5/8 vertical, run coax through the center, and add a directional antenna to the top and bottom so that the whole thing can be stuck into a truck box pocket and used to fill in the campsite with signal. Everything sounded good until theory turned into practice. I've been doing some research into how cell sites work, and it seems that different sectors can transmit on different frequencies. But while driving around monitoring the cellular band, it seemed apparent that the farther I got from the site (irrespective of direction), the more my phone tended to shift to a lower frequency band.
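To sanity-check whether the back-to-back antenna idea has any headroom at all, here's a minimal link-budget sketch. Every number in it (tower EIRP, distances, antenna gains, coax loss) is an assumption I picked for illustration, not a measurement from the actual site:

```python
# Rough link budget for a back-to-back "passive repeater".
# All inputs below are assumed/illustrative -- swap in your own numbers.
import math

def fspl_db(freq_mhz: float, dist_km: float) -> float:
    """Free-space path loss in dB: 32.45 + 20*log10(f_MHz) + 20*log10(d_km)."""
    return 32.45 + 20 * math.log10(freq_mhz) + 20 * math.log10(dist_km)

tower_eirp_dbm = 60.0          # assumed macro-cell EIRP per carrier
d_tower_to_rptr_km = 3.0       # assumed: hilltop has a decent path to the site
d_rptr_to_phone_km = 0.1       # ~100 yards down to the campsite
freq_mhz = 722.0               # the low band mentioned below
g_donor_dbi = 12.0             # assumed directional antenna aimed at the tower
g_service_dbi = 5.0            # assumed antenna pointed down at the camp
coax_loss_db = 2.0             # short run of decent coax at 700 MHz

rx_at_rptr = tower_eirp_dbm - fspl_db(freq_mhz, d_tower_to_rptr_km) + g_donor_dbi
rx_at_phone = (rx_at_rptr - coax_loss_db + g_service_dbi
               - fspl_db(freq_mhz, d_rptr_to_phone_km))
print(f"signal into repeater: {rx_at_rptr:.1f} dBm")   # about -27 dBm here
print(f"signal at the phone:  {rx_at_phone:.1f} dBm")  # about -94 dBm here
```

With those made-up numbers the phone ends up in marginal-but-usable territory, which matches the general experience that passive repeaters only pay off when both antennas have real gain and the re-radiating hop is short.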

It makes plenty of sense that 1 GHz+ would do great at close range, but pushing any distance through dense forest would favor sub-GHz frequencies.
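To put a rough number on that intuition, here's a quick comparison of total path loss across bands using free-space loss plus Weissberger's modified exponential decay model for the foliage term. The 3 km path and 200 m of tree depth are assumptions for illustration:

```python
# Why lower bands win in the woods: free-space loss plus a foliage term
# (Weissberger's modified exponential decay model). Illustrative numbers only.
import math

def fspl_db(freq_mhz: float, dist_m: float) -> float:
    """Free-space path loss in dB with f in MHz and d in meters."""
    return 20 * math.log10(freq_mhz) + 20 * math.log10(dist_m) - 27.55

def weissberger_db(freq_ghz: float, foliage_m: float) -> float:
    """Excess loss through foliage of depth d (model valid roughly 14-400 m)."""
    return 1.33 * (freq_ghz ** 0.284) * (foliage_m ** 0.588)

dist_m, foliage_m = 3000.0, 200.0  # assumed path with 200 m of trees in it
for f_mhz in (722.0, 850.0, 1900.0):
    total = fspl_db(f_mhz, dist_m) + weissberger_db(f_mhz / 1000.0, foliage_m)
    print(f"{f_mhz:6.0f} MHz: {total:.1f} dB total path loss")
```

On that assumed path, 1900 MHz comes out roughly 17 dB worse than 722 MHz, most of it from the extra free-space loss plus the stronger foliage absorption up high.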

So the question is this: Are cell sites designed to provide low-band coverage in all directions, or do they only point the lower frequencies toward the areas with a higher density of rural customers?

Also, given that there are so many 4G LTE bands, can anyone recommend a wide-bandwidth directional antenna? I've run the numbers on LPDAs, and the feeder impedance is unrealistically low to achieve what I need. At this point, I am considering a linear parabolic reflector with a small telescoping dipole along the focal line, in case I drive 4 hours to find it was 850 MHz rather than the 722 MHz AT&T uses here.
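For anyone curious about the sizing, here's a quick sketch of the dipole lengths for the two candidate bands and the focal-line distance for a parabolic section, assuming a half-wave dipole with the usual ~0.95 end-effect shortening and a made-up 60 cm x 10 cm reflector:

```python
# Sizing sketch: half-wave dipole lengths for the two candidate bands and the
# focal line of a parabolic section via f = D^2 / (16 * depth).
# Reflector dimensions below are assumed, not a design recommendation.
c = 299_792_458.0  # speed of light, m/s

def halfwave_dipole_m(freq_mhz: float, k: float = 0.95) -> float:
    """Approximate tip-to-tip half-wave dipole length in meters."""
    return k * c / (freq_mhz * 1e6) / 2.0

def focal_length_m(aperture_m: float, depth_m: float) -> float:
    """Focal distance of a parabolic section of given aperture and depth."""
    return aperture_m ** 2 / (16.0 * depth_m)

for f_mhz in (722.0, 850.0):
    print(f"{f_mhz:.0f} MHz dipole: {halfwave_dipole_m(f_mhz)*100:.1f} cm tip to tip")

# e.g. an assumed 60 cm wide, 10 cm deep reflector puts the focal line 22.5 cm out
print(f"focal line: {focal_length_m(0.60, 0.10)*100:.1f} cm from the vertex")
```

That gives roughly 20 cm at 722 MHz versus 17 cm at 850 MHz, which is why a telescoping element at the focal line is appealing: one reflector, and the driven element retunes on site.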

Thanks!