URI Summer Research Award Application: Proposal
Many audiences and outside observers are aware of technological advances in lighting control, of the expanding role digital technology has played in audio, and of the proliferation of large mechanical special effects in commercially staged events in this country, ranging from Broadway musical theater to Las Vegas casino shows. Many of the artists, craftspeople, and technicians who collaborate in performance consistently want to push the boundaries of their disciplines but lack the financial resources of a commercial enterprise. As one of these collaborators, I am motivated to identify technology that offers newer and broader possibilities to artists in collaborative performance. I believe it is possible, given readily available technology, to develop and deliver sophisticated technical solutions to performing artists and organizations across a greater economic range.

Live theatrical performance has consistently relied on its experts in technology, the technical designers, to create and support the visual and dynamic aspects of physical design for production. Historically, technical designers associated with non-commercial performance venues have rarely had the resources to develop and produce the elaborate mechanical scenic effects sought by the scene designers and directors of collaborative performance events. Commercial production, on the other hand, has generally relied on individual and expensive engineered solutions to scenic problems at a cost and sophistication not readily available to most production organizations. Advances over the last ten years in microprocessor control and power-switching electronics have resulted in a new market of relatively inexpensive but highly sophisticated control equipment that is readily available "off the shelf" for use in industry at large.
By adapting this technology to the performing arts, along with technology already used by performers, I have the opportunity to open new avenues for production design and collaboration. It is useful to understand the term control in reference to automated mechanical effects on stage. In general, the technical designer wants to control the motion characteristics of a scenic piece. Simply stated, this means control of starting, stopping, direction, position, acceleration, deceleration, and speed. This may be accomplished by people, machines, or some combination of the two. Traditionally, live performance has opted for human-powered movement of scenic effects, but motorized or remotely powered systems are employed on stage under one or more of the following circumstances:
Machines, when properly designed and operated, are ideally suited to perform and regulate these motions. There are two sources for the technology I will draw from in this project. Although stagecraft embraces technical processes peculiar to the discipline alone, theater technologists have looked to industry at large for solutions to many staging problems. A great deal of the available technology in the field of motion control has been developed to automate manufacturing processes, and the result is a proliferation of reasonably priced components from which engineers may choose. Where industrial designers may set a process working that repeats identically for weeks, months, or years, the challenge for the technical designer in theater production is to develop a system of motion control that can quickly be changed to accommodate the dynamic nature of the rehearsal and performance process. The second source of technology for this project comes out of the performing arts itself: MIDI (Musical Instrument Digital Interface). MIDI, simply defined, is a protocol that encodes or simulates the basic movements of a performer at an electronic musical instrument. A variety of physical instruments can be encoded, but the most common is the piano keyboard. Almost any other MIDI device that stores digital or electronic sounds can play back, or be triggered by, the encoded movements of the performer. The versatility of this protocol lies in the fact that this playback or triggering can take place remotely, both in time and location, from the original performer. In this phase of my research on automated motion control for the stage, I am looking to develop the portion of the control system between the operator/stagehand and the motor drive, motor, and mechanical equipment that actually moves the scenic effect. My previous work has focused on the technology of motor drives. These devices are briefly defined below.
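As a concrete illustration of this encoding, the message generated by a single key-press (a Note On) is only three bytes long. The sketch below (Python, for illustration only) assembles one; the channel, note, and velocity numbers are arbitrary example values:

```python
def note_on(channel, note, velocity):
    """Encode a MIDI Note On message: a status byte (0x90 plus the
    0-15 channel number) followed by two 7-bit data bytes giving
    the key number and the force (velocity) of the key-press."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Middle C (note 60) struck firmly on channel 0:
print(note_on(0, 60, 100).hex(" "))  # 90 3c 64
```

Releasing the key produces a matching Note Off message, so a receiving device knows both when the performer's gesture began and when it ended.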
I will use MIDI technology to provide a performance-friendly interface to the industrial components that form the core of the electro-mechanical laboratory. I have chosen to model a typical motorized cable winch system. Electrically motorized winches are the mainstay of machinery used to move scenery on the stage. These winches generally consist of large electric motors and gearboxes driving rotating drums that draw wire rope in and out. The wire rope (cable), which is fixed to the moving piece, pulls the effect in the desired direction. Typical applications for cable winches include moving platforms and similar scenic units across stage, flying pieces in and out from overhead, and rotating circular turntables. The motor/gearbox combination that turns the winch is driven by a piece of electronic equipment known as a motor drive. Motor drives, when installed in line between the electric power supply and the terminals of an electric motor, allow operation at variable speeds. Motor drives are usually configured to accept electronic signals from other devices that modify parameters of the motor's operation, speed being the most important of these. The exact nature of the interface between the operator/stagehand and this aspect of the motor drive is my goal for this phase of research into the control of stage machinery. My recent investigation has led me to a variety of devices that convert MIDI signals into the simple voltage signals the motor drive uses to control the motor's speed. Since motor drives are usually manufactured with keypad interfaces that allow an operator only to pre-program speed and direction, not to alter and adjust them "on the fly" as live performance requires, an alternative interface is called for. I tell my students that operating such a machine would be analogous to manipulating the accelerator and brake of an automobile with a telephone keypad.
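The scaling such a converter performs is straightforward. Many motor drives accept a 0-10 V analog speed reference (the exact input range is an assumption here; the drive's manual governs), while MIDI data values are 7-bit, 0-127. A minimal sketch of the mapping:

```python
def velocity_to_volts(velocity, v_max=10.0):
    """Map a 7-bit MIDI value (0-127) onto an analog speed-reference
    voltage. The 0-10 V range is an assumed drive input; substitute
    the range the actual motor drive specifies."""
    if not 0 <= velocity <= 127:
        raise ValueError("MIDI data values are 7-bit: 0-127")
    return v_max * velocity / 127

# A moderate key-press (velocity 64) requests roughly half speed:
print(round(velocity_to_volts(64), 2))  # 5.04
```

In the real system this arithmetic happens inside the MIDI-to-voltage converter hardware; the sketch only makes the proportionality explicit.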
A significant inducement to the use of MIDI devices is that they are designed with the performer in mind. That is, a MIDI-capable musical keyboard sends out electronic signals representing (among other things) note ON, note OFF, and the velocity or force of the key-press. In my proposed control system, these "musical" parameters are translated into commands for mechanical motion: the key-press initiates motion, the relative force of the key-press indicates the desired speed, and release of the key signals stop. Alternate keys, then, select alternate directions of movement. The operator of the scenic effect will actually "play" the device. The ability to "play" allows the operator/stagehand to actively participate in the performance beyond simply pushing a START and STOP button. This person can adapt scenic movement to the other dynamic elements of live production, namely the performers, the lighting, and the audio score. Should predictable and repeatable motion be required from performance to performance, MIDI output may be recorded and replayed. Most personal computers can be configured easily and inexpensively to perform this function. Similarly, numerous software programs are already available for editing these recorded MIDI sequences. Since the MIDI protocol and its associated devices are already well understood and accepted within the performance and entertainment industry, much of the technology outside the realm of moving scenery already implements MIDI. Manufacturers of audio consoles, lighting consoles, robotic lighting fixtures, pyrotechnic effects, and projection equipment all offer products that "speak" MIDI. Adoption of this standard allows simultaneous control of all the equipment noted above. For example, execution of a single lighting cue can trigger a change in lighting as well as start a film projector, roll prerecorded audio tape, and rotate a turntable platform on the stage.
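The key-to-motion translation described above can be sketched as a simple lookup. The note numbers chosen for "in" and "out" and the simplified event representation below are hypothetical choices for illustration, not part of any standard:

```python
# Hypothetical key assignments: middle C drives the winch "in",
# the adjacent C# drives it "out".
NOTE_IN, NOTE_OUT = 60, 61

def midi_to_motion(status, note, velocity):
    """Translate a simplified (status, note, velocity) MIDI event
    into a (direction, speed) command, with speed normalized 0-1.
    Key-press starts motion, velocity sets speed, release stops.
    Returns None for keys the winch controller ignores."""
    directions = {NOTE_IN: "in", NOTE_OUT: "out"}
    if note not in directions:
        return None
    # By MIDI convention, a Note On with velocity 0 acts as Note Off.
    if status == "note_off" or velocity == 0:
        return (directions[note], 0.0)          # stop
    return (directions[note], velocity / 127)   # run at scaled speed

print(midi_to_motion("note_on", 60, 127))   # ('in', 1.0)
print(midi_to_motion("note_off", 60, 0))    # ('in', 0.0)
```

A recorded MIDI sequence replayed through the same function would reproduce the operator's original "performance" of the scenic move exactly.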
In order to provide even greater integration of the elements of production, a master protocol named MIDI Show Control (MSC) has been adopted. MSC allows either a master "stage manager's" console or an individual piece of equipment not only to send a GO or STOP command to any other, but to monitor the status of each effect as well. This has a significant positive effect on the safety of adapting the technology to complex and potentially dangerous effects.

Description of Research Process

The research process for this project involves the following steps:
These steps in the research process form the majority of the work I can finish before and during the granting period. Certainly the majority of the construction and testing must take place in the summer session. During that period, I can devote extended time to work in the workshop and laboratory, apart from the significant demands of teaching and administration associated with the long semesters. Further, I feel I can accomplish better research if I am not bound to the deadline of an "opening night," as I would be if developing this idea as part of an individual production. The work will take place in the new electro-mechanical laboratory and workshop I am developing as part of the Department of Theater and Dance and the COFA Performing Arts Center. Much of the equipment I need is already available in the College of Fine Arts or through equipment allocations available from the Department of Theater and Dance. I intend to pursue additional sources of funding as indicated previously in the application packet, and I expect that all additional equipment needs can be met through these funds. Once the prototype of the control system is in place at the end of the grant period, I will test it in an appropriate College of Fine Arts production during the 1997-98 academic year. I will also submit my work to professional regional theaters across the U.S., where I hope to further refine the system. Given the need for new implementations of this kind of technology, publication in a professional journal in the field of technical theater is a reasonable expectation.
© Fritz Schwentker -- 26 August 2004