Google crawls your site by following its links, and therein lies the root of your problem. Google's algorithms are all about saving time, and having to index the same page several times simply wastes it. Not only that, following redirects and digging too deep into a site is considered a waste of time as well.
A better description of the problem
Let's start with redirects. When you have too many redirects, you force Google's spider to follow tens if not hundreds of links just to reach the actual pages, a process that takes time away from crawling other sites. Think of a movie that's 30 minutes long but makes you watch 30 minutes of blank screen first just to get to the film. You probably wouldn't accept that, so what makes you think Google will?
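To see how redirect chains pile up, here is a minimal Python sketch. The URLs and the `redirects` mapping are hypothetical examples, and the function simply follows a source-to-target map the way a crawler would follow `301` responses:

```python
def redirect_chain(redirects, url, max_hops=10):
    """Follow a mapping of {source URL: redirect target} and return
    the full chain of hops until a non-redirecting page is reached."""
    chain = [url]
    while url in redirects:
        url = redirects[url]
        if url in chain:
            raise ValueError("redirect loop detected at " + url)
        chain.append(url)
        if len(chain) > max_hops:
            raise ValueError("too many redirects")
    return chain

# Hypothetical site where three legacy URLs chain into one another.
redirects = {
    "/old-page": "/moved-page",
    "/moved-page": "/renamed-page",
    "/renamed-page": "/final-page",
}

# A crawler landing on /old-page makes three extra requests
# before it ever sees real content.
print(redirect_chain(redirects, "/old-page"))
```

Every hop in that list is a request the spider spends on plumbing instead of content, which is exactly the blank-screen time in the movie analogy.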
At the moment Google is known to drop rankings for this flaw, but in my opinion it will eventually stop crawling those pages altogether. Which brings me to my next point: link depth. When you build your site you set up categories, sub-categories, and the list goes on. It's the same bureaucracy we all complain about, yet we still use it on our own sites.
To get to a page you sometimes need to dig five, six, and sometimes even 20 or 30 links deep. Do you see the waste of time? You depend on Google to index that page so the visitor can reach it with a single click from the SERP, but Google stopped indexing pages like that a long time ago. Today, if your page is more than two or three links deep, it's not getting indexed.
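Link depth is just the number of clicks from the home page, which you can measure with a breadth-first walk over your site's link graph. This is a minimal sketch; the `site` map and its page paths are made-up examples standing in for a crawl of your own site:

```python
from collections import deque

def link_depths(links, start="/"):
    """Breadth-first walk over an adjacency map {page: [linked pages]},
    returning the minimum number of clicks from start to each page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depths:
                depths[nxt] = depths[page] + 1
                queue.append(nxt)
    return depths

# Hypothetical category chain: each level adds one click.
site = {
    "/": ["/category"],
    "/category": ["/sub-category"],
    "/sub-category": ["/product"],
}

# /product ends up three clicks deep -- past the two-or-three-link
# threshold discussed above.
print(link_depths(site))
```

Any page whose depth comes back higher than two or three is a candidate for a shortcut link closer to the home page.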
What can you do?
First of all, check your links. Make sure each one leads directly to the next page and doesn't redirect to another. This is a lot of work, I agree, but it's a job you only have to do once in a while. Secondly, make sure all of your site's pages are accessible from the home page; find a faster way to reach every page, one that won't discourage the visitor or the spider crawling your site.
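The two checks above can be automated together: flag pages that can't be reached from the home page at all, and pages buried deeper than your chosen threshold. This is a sketch under the same assumptions as before, with a made-up `site` link map and a hypothetical `max_depth` cutoff:

```python
from collections import deque

def pages_to_fix(links, start="/", max_depth=3):
    """Return (unreachable, too_deep): pages with no path from the home
    page, and pages buried more than max_depth clicks down. Both lists
    are candidates for new links higher up in the structure."""
    depths = {start: 0}
    queue = deque([start])
    all_pages = set(links) | {p for targets in links.values() for p in targets}
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depths:
                depths[nxt] = depths[page] + 1
                queue.append(nxt)
    unreachable = sorted(all_pages - set(depths))
    too_deep = sorted(p for p, d in depths.items() if d > max_depth)
    return unreachable, too_deep

# Hypothetical site: /orphan has no inbound links, and /deep and
# /deeper sit past a two-click cutoff.
site = {
    "/": ["/about", "/category"],
    "/category": ["/sub"],
    "/sub": ["/deep"],
    "/deep": ["/deeper"],
    "/orphan": [],
}

print(pages_to_fix(site, max_depth=2))
```

Running a report like this on your monthly checkup date gives you a concrete to-do list instead of an eyeball inspection.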
Finally, set a checkup date once a month to make sure your site's structure stays neat. Sometimes this alone will raise your rankings.