Among your key responsibilities as a Computer Graphics Supervisor may be the design or implementation of a CG pipeline. Even if you do not design one, you will manage one, and as the pipeline manager your concern will be to keep it running efficiently. Further, even if you do not design the pipeline you manage, you will be accountable to management to ensure that the pipeline design and staff capabilities are appropriate for current needs.
Your task will be compounded by a very simple problem: jargon. The term “pipeline” is rather ill-defined. Like so many terms in computer graphics, it was adopted from another discipline and repurposed to mean something specific to our situation. You will soon find that there is little literature on the subject and that “pipeline” means different things to different people within computer graphics.
I wonder how anyone could believe a crew of 10–20 artists could have spent the last 12 months making around 1,800 VFX shots without a very functional pipeline. Describing an asset management system….
The term “pipeline” may seem grounded in the antiquity of computer graphics if you’ve only been around a few years, but it is fairly recent jargon, introduced 10–12 years ago. “Pipeline” replaced “workflow”, which in turn replaced “CG process” and “CG production phases” around 20 years ago. Each change in terminology has both opened and constricted our thinking about visual effects production processes and the people who make them happen.
The term “pipe” entered computer jargon with the development of UNIX. A UNIX pipe is an input-output mechanism that eliminates the need for program A to write a stream of data to a file that program B will then read and process. Instead, the shell allows the insertion of a pipe symbol between two commands, which means, “connect the output of program A to the input of program B”, skipping the intermediate file to save time and disk space.
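The contrast is easy to see at a shell prompt. Below is a minimal sketch (the sample file and its contents are hypothetical, made up for illustration) showing the same two-step job done first with an intermediate file and then with a pipe:

```shell
# Create a small sample file (hypothetical data for illustration).
printf 'red\nblue\nred\ngreen\nblue\nred\n' > words.txt

# Without a pipe: program A (sort) writes its output to a file on disk,
# which program B (uniq) then reads as its input.
sort words.txt > sorted.txt
uniq -c sorted.txt

# With a pipe ("|"): the shell connects sort's stdout directly to
# uniq's stdin, so no intermediate file is written to disk.
sort words.txt | uniq -c
```

Both forms print identical counts; the piped version simply never touches the disk between the two programs, and the two programs can run concurrently as data streams between them.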
A few years later, Silicon Graphics (SGI), a young company building computers optimized for graphics, designed and sold graphics processors built around a “pipeline” concept. Essentially, SGI’s graphics engine encapsulated in hardware, at a great speed advantage, the UNIX idea of piping data through a series of simple modules, turning a series of small operations into a significant result.
Over time, the concept of moving computer data through a series of software modules was translated into a work process: progressively moving the work product through a series of specialized workers (who happen to use node-based software and other technologies, all based on the software pipe concept). And so “CG process” became “CG workflows”, and these became “CG pipelines”….
For some time I had been thinking of the pipeline in a very narrow sense, as a type of CG workflow or a description of the CG process. Suddenly I saw that a pipeline could be the flow of any kind of data, not just the flow of our work product, and that the CG pipeline we used consisted of a multitude of different classes of pipelines that touched one another in places. In an instant, concepts I was thoroughly familiar with became redefined and reorganized. Like parallel universes, pipelines coexisted within my production environment in different dimensions of perspective.