Sorry, but we have no such estimation tool. I don't think a computer can reliably estimate these things without actually running the graph first. I can give you a few tips, though.
1) Memory usage depends on the number of input records, their size, and the way you process them (the components used, the CTL functions used, the efficiency of processing, the number of graph phases, etc.). Components such as sorters and joiners, which keep all records in memory, can have an especially high impact on overall memory requirements.
2) Temp space depends on the number of debug edges, the number of buffered edges and their buffer sizes, and the number of temp files you create and the records stored in them.
3) The graph run log shows the amount of memory used. By examining the log from a run in your development environment, you can get a hint about the memory requirements in production.
4) You can also use tools like jvisualvm to examine JVM behavior during a graph run. If you compare the results for a run with 100 records processed against one with 200 records, you should be able to extrapolate the results for 300 records.
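To illustrate the extrapolation idea in tip 4: if memory grows roughly linearly with record count, two measured runs give you a slope (memory per record) and an intercept (fixed JVM and graph overhead). This is just a sketch with made-up numbers, not real measurements from any graph:

```python
def extrapolate_memory(n1, mem1, n2, mem2, n_target):
    """Fit mem = slope * n + base through two measured (records, MB) points
    and evaluate at n_target. Assumes roughly linear growth, which will not
    hold for components that buffer everything in memory past a threshold."""
    slope = (mem2 - mem1) / (n2 - n1)   # MB per record
    base = mem1 - slope * n1            # fixed overhead (JVM, graph setup)
    return base + slope * n_target

# Hypothetical example: runs with 100 and 200 records used 150 MB and 210 MB.
estimate = extrapolate_memory(100, 150.0, 200, 210.0, 300)
print(f"Estimated heap for 300 records: {estimate:.0f} MB")  # → 270 MB
```

Keep in mind this is only a first approximation; sorters or joiners can make memory grow much faster than linearly, so measure a few more points if the two-run estimate looks suspicious.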
I know this is only a high-level view of the topic, so do not hesitate to ask about any specific things you are interested in.