At the moment, the trumpet everyone seems to be blowing is that film and games production are converging. If you believe the hype, we'll soon be in some sort of nirvana where we all use the same approaches and tools in the art of making both.
Granted, we use similar tools: Photoshop in one hand and Maya/Max/XSI in the other. And yes, all of our efforts end up being viewed on an HDTV. This, however, is where convergence comes to a grinding halt.
Show me a games artist who needs to camera-track, or even cares. And what VFX artist worries about atomics or engine performance? Films are a linear story event where someone points a camera at something. Games, on the other hand, are non-linear. For as long as this remains the case, the processes used to generate both will never converge.
The techniques of a VFX artist and those of a games artist are similar, in that both model, light and so on. However, how a model is generated for a game, with all its LODs, collision models and much more, is very different to modelling in VFX.
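To make the LOD point concrete, here is a minimal sketch of runtime distance-based LOD selection, the kind of engine concern a games artist has to author assets around and a VFX artist never sees. The function name and distance thresholds are purely illustrative, not any particular engine's API.

```python
def select_lod(distance, thresholds=(10.0, 30.0, 80.0)):
    """Return an LOD index for a mesh: 0 = full detail, higher = coarser.

    `thresholds` are hypothetical camera-distance cutoffs in metres;
    a real engine would also factor in screen coverage, hysteresis, etc.
    """
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)  # beyond the last cutoff: coarsest model
```

A VFX model is built once, for the shots it appears in; a game model has to ship with every one of those detail levels (plus collision geometry) because the player can put the camera anywhere.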
The UVs for a VFX model only have to hold up to camera (and sometimes not even that), but for a game they have to work from every viewing angle. This theme runs through the entire process: colour pipelines and linear floating-point compositing and rendering in VFX; level design, boss encounters and HDR environments in games.
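The "linear floating-point" part of the VFX pipeline is worth unpacking: display images are stored in a non-linear encoding such as sRGB, so compositing operations (adds, blurs, motion blur) only behave like real light once the values are linearised. A sketch of the standard sRGB transfer function (per IEC 61966-2-1):

```python
def srgb_to_linear(c):
    """Convert one sRGB channel value in [0, 1] to linear light.

    This is the standard sRGB electro-optical transfer function:
    a small linear toe below 0.04045, a 2.4-exponent curve above it.
    """
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4
```

Compositing in linear float is why a 50% grey pixel in an sRGB frame actually represents only about 21% of the light energy; add two plates together in display space and the result is physically wrong.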
At the end of the day, as a 3D artist you end up making either films or games. Yes, some of your skills transfer, but the art and process of making the two are still far from converging.