Can someone give me a formula to calculate rate of descent using groundspeed, the amount of altitude to lose, and time? It would be nice to be able to calculate how many miles away from the airport to begin a descent at, say, 1,000 ft per minute.
Here's what is taught straight from my airline's manual. For your top-of-descent point, take the amount of altitude you have to lose, in thousands of feet, and multiply that number by three. This will give you about a 3-degree descent angle. For example, you're at 10,000 ft and the airport of landing is at 1,000 ft. Take your 9,000 ft of altitude to lose (9 thousand) and multiply by 3, giving you 27 miles out as your top-of-descent point. It's always a good idea to add a little fudge factor to this number to account for winds aloft, etc. So for this example, start down from 10,000 ft about 30 miles out. As far as what rate of descent to use, just multiply your groundspeed at the top of descent by a factor of 5, and that should give you a good ballpark number to shoot for.
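The two rules of thumb above can be sketched as a couple of quick Python helpers (the function names and the default fudge factor are my own, not from any manual):

```python
def top_of_descent_nm(cruise_alt_ft, field_alt_ft, fudge_nm=3):
    """Rule of three: thousands of feet to lose, times 3, gives NM to start down."""
    alt_to_lose_ft = cruise_alt_ft - field_alt_ft
    return (alt_to_lose_ft / 1000) * 3 + fudge_nm

def descent_rate_fpm(groundspeed_kt):
    """Groundspeed x 5 gives a ballpark rate (fpm) for roughly a 3-degree path."""
    return groundspeed_kt * 5

# The example from above: cruising at 10,000 ft, field elevation 1,000 ft
print(top_of_descent_nm(10000, 1000))   # 30.0 NM out, fudge included
print(descent_rate_fpm(300))            # 1500 fpm at 300 kt groundspeed
```

The groundspeed×5 rule works because a 3-degree path loses about 300 ft per NM, and at GS knots you cover GS/60 NM per minute.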
Personally, when I'm flying for real, I just break it down into how many miles per minute I'm covering in groundspeed, taking into account how much altitude I need to lose and what rate of descent I want to use. So in this example, let's say we're at FL280 and need to be at 10,000 ft at a certain fix. We're doing 420 knots across the ground, which breaks down into 7 NM per minute (420/60). I need to lose 18,000 ft, so at 2,000 ft per minute that's 9 minutes of descent. If I'm doing 7 miles per minute and I need at least 9 minutes to descend at 2,000 fpm, then I need to start down at least 63 miles out. Again, I always add a fudge factor, so I'd make it an even 70 miles out in this case. This approach lets you vary your descent rate as you choose.
Hope I didn't confuse you.