Aspen Movie Map

The Aspen Movie Map was a revolutionary hypermedia system developed at MIT by a team working with Andrew Lippman in 1978 with funding from ARPA.

Features

The Aspen Movie Map enabled the user to take a virtual tour through the city of Aspen, Colorado (that is, a form of surrogate travel). It is an early example of a hypermedia system.

A gyroscopic stabilizer with four 16mm stop-frame film cameras was mounted on top of a car with an encoder that triggered the cameras every ten feet. The distance was measured from an optical sensor attached to the hub of a bicycle wheel dragged behind the vehicle. The cameras were mounted in order to capture front, back, and side views as the car made its way through the city. Filming took place daily between 10 a.m. and 2 p.m. to minimize lighting discrepancies. The car was carefully driven down the center of every street in Aspen to enable registered match cuts.

The film was assembled into a collection of discontinuous scenes (one segment per view per city block) and then transferred to laserdisc, the analog-video precursor to modern digital optical disc storage technologies such as DVDs. A database correlated the layout of the video on the disc with the two-dimensional street plan. Thus linked, the user could choose an arbitrary path through the city; the only restrictions were the need to stay in the center of the street, to move ten feet between steps, and to view the street from one of the four orthogonal views.
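The correlation between disc layout and street plan can be illustrated with a minimal sketch. All identifiers, frame numbers, and the indexing scheme below are hypothetical; the original database format is not documented here.

```python
# Minimal sketch of mapping a street position to a laserdisc frame.
# One entry per (street, block, view): the range of disc frames holding
# that segment. Views are the four orthogonal camera directions.
VIEWS = ("front", "back", "left", "right")

segment_index = {
    # (street, block, view) -> (first_frame, last_frame), hypothetical values
    ("Main St", 1, "front"): (1000, 1052),
    ("Main St", 1, "right"): (2000, 2052),
}

STEP_FEET = 10  # the cameras were triggered every ten feet

def frame_for_position(street, block, view, feet_into_block):
    """Map a position along a block to the disc frame to display."""
    first, last = segment_index[(street, block, view)]
    step = feet_into_block // STEP_FEET   # quantize to ten-foot steps
    return min(first + step, last)        # clamp at the end of the block

print(frame_for_position("Main St", 1, "front", 35))  # -> 1003
```

Because every frame is ten feet from its neighbor and every block has all four views, arbitrary paths reduce to simple index arithmetic over this table.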

Interaction was controlled through a dynamically generated menu overlaid on the video image: speed and viewing angle were modified by selecting the appropriate icons through a touch-screen interface, a precursor of the now-ubiquitous interactive video kiosk. Commands were sent from the client process, which handled user input and overlay graphics, to a server that accessed the database and controlled the laserdisc players. Another interface feature was the ability to touch any building in the current field of view and jump to its facade, in a manner similar to the ISMAP feature of web browsers. Selected buildings contained additional data, such as interior photographs, historical images, restaurant menus, and video interviews with city officials, allowing the user to take a virtual tour through those buildings.
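The client/server split described above can be sketched as follows. The command names and the 2x-speed behavior are illustrative assumptions, not the original protocol.

```python
# Hypothetical sketch of the Movie Map's client/server division of labor:
# the client turns touch input into commands; the server drives the
# (here simulated) laserdisc player.

class LaserdiscServer:
    """Server side: owns the database and the disc player state."""
    def __init__(self):
        self.current_frame = 0

    def handle(self, command, arg=None):
        if command == "SEEK":       # jump to an absolute disc frame
            self.current_frame = arg
        elif command == "STEP":     # advance by whole ten-foot steps
            self.current_frame += arg
        return self.current_frame

class TouchClient:
    """Client side: user input and overlay graphics."""
    def __init__(self, server):
        self.server = server

    def icon_pressed(self, icon):
        # Translate an overlay icon into a server command.
        if icon == "forward":
            return self.server.handle("STEP", 1)
        if icon == "faster":
            return self.server.handle("STEP", 2)  # skip frames: double speed

server = LaserdiscServer()
client = TouchClient(server)
client.icon_pressed("forward")
print(server.current_frame)  # -> 1
```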

Building facades were texture-mapped onto a 3D model. The same 3D model was used to translate 2D screen coordinates into the building database, providing the hyperlinks to the additional data.
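Resolving a touch into a building amounts to a hit test against the model's facades as projected into the current view, much like an ISMAP lookup. In this simplified sketch each facade is already a 2D screen rectangle; the building names and coordinates are hypothetical placeholders.

```python
# Sketch of an ISMAP-style hit test: screen touch -> building record.
facade_hitmap = {
    # building id -> (x_min, y_min, x_max, y_max) in screen coordinates
    "hotel_jerome": (40, 80, 200, 300),
    "wheeler_opera_house": (220, 60, 420, 310),
}

def building_at(x, y):
    """Return the building under a screen touch, or None."""
    for building, (x0, y0, x1, y1) in facade_hitmap.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return building
    return None

print(building_at(100, 150))  # -> 'hotel_jerome'
```

In the real system the rectangles would be recomputed per frame by projecting the 3D model from the car's recorded position.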

In a later implementation, metadata, much of it automatically extracted from the animation database, was encoded as a digital signal within the analog video. The data encoded into each frame contained all the information needed to enable a fully featured surrogate-travel experience.
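Per-frame metadata of this kind can be illustrated as a packed bit field that survives a round trip through encoding and decoding. The field names and bit widths here are assumptions for illustration, not the original encoder's format.

```python
# Illustrative sketch of packing per-frame position metadata into a
# single integer word, so playback needs no external database.
# Layout (assumed): 10 bits street | 8 bits block | 8 bits step | 2 bits view.

def encode_frame_metadata(street_id, block, step, view):
    """Pack position data into one integer bit field."""
    return (street_id << 18) | (block << 10) | (step << 2) | view

def decode_frame_metadata(word):
    """Recover the fields from an encoded frame word."""
    return {
        "street_id": (word >> 18) & 0x3FF,
        "block": (word >> 10) & 0xFF,
        "step": (word >> 2) & 0xFF,
        "view": word & 0x3,
    }

word = encode_frame_metadata(street_id=5, block=12, step=7, view=2)
print(decode_frame_metadata(word))
```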

Another feature of the system was a navigation map overlaid above the horizon at the top of the frame. The map both indicated the user's current position (as well as a trail of the previously traversed streets) and allowed the user to jump to a two-dimensional city map, which provided an alternative way of moving through the city. Additional features of the map interface included the ability to switch between correlated aerial photographs and cartoon renderings with routes and landmarks highlighted, and the ability to zoom in and out in a manner reminiscent of Charles Eames's Powers of Ten film.

Aspen was filmed in early fall and in winter, and the user could switch seasons at will while moving along a street or viewing a facade. A three-dimensional polygonal model of the city was also created using the Quick and Dirty Animation System (QADAS), which featured 3D texture-mapping of the facades of landmark buildings using an algorithm designed by Paul Heckbert. These computer-graphics images, also stored on the laserdisc, were correlated with the video, allowing the user to view an abstract rendering of the city in real time.

Credits

MIT undergraduate Peter Clay, with help from Bob Mohl and Michael Naimark, filmed the hallways of MIT with a camera mounted on a cart. The film was transferred to a laserdisc as part of a collection of projects being done at the Architecture Machine Group (ArcMac).

The Aspen Movie Map was filmed in the fall of 1978, in winter 1979 and briefly again (with an active gyro stabilizer) in the fall of 1979. The first version was operational in early spring of 1979.

Many people were involved in the production, most notably: Nicholas Negroponte, founder and director of the Architecture Machine Group, who found support for the project from the Cybernetics Technology Office of DARPA; Andrew Lippman, principal investigator; Bob Mohl, who designed the map overlay system and ran user studies of the efficacy of the system for his PhD thesis; Richard Leacock (Ricky), who headed the MIT Film/Video section and shot, along with MS student Marek Zalewski, the cinéma vérité interviews placed behind the facades of key buildings; John Borden, of Peace River Films in Cambridge, Massachusetts, who designed the stabilization rig; Kristina Hooper Woolsey of UCSC; Rebecca Allen; Scott Fisher, who matched the historical society's photos of Aspen in the silver-mining days to the same scenes in Aspen in 1978 and who experimented with anamorphic imaging of the city (using a Volpe lens); Walter Bender, who designed and built the interface, the client/server model, and the animation system; Steve Gregory; Stan Sasaki, who built much of the electronics; Steve Yelick, who worked on the laserdisc interface and anamorphic rendering; Eric "Smokehouse" Brown, who built the metadata encoder/decoder; Paul Heckbert, who worked on the animation system; Mark Shirley and Paul Trevithick, who also worked on the animation; Ken Carson; Howard Eglowstein; and Michael Naimark, who was at the Center for Advanced Visual Studies and was responsible for the cinematography design and production.

The Ramtek 9000 series image display system was used for this project; Ramtek created a 32-bit interface to the Interdata for the purpose. Ramtek's image display systems offered not only the square resolutions (256×256 or 512×512) of its competitors but also screen-matched resolutions such as 320×240, 640×512, and 1280×1024. The original GE CAT scanners all used the Ramtek 320×240 display. Some prices of the day may be of interest: a keyboard, joystick, or trackball each sold for around $1,200; a 19-inch CRT had an OEM price of around $5,000 and was purchased from Ikegami in Japan; and the production of a single laserdisc master (around 13 inches) cost $300,000.

Purpose and applications

ARPA funding during the late 1970s was subject to the military application requirements of the Mansfield Amendment introduced by Mike Mansfield (which had severely limited funding for hypertext researchers like Douglas Engelbart).

The Aspen Movie Map's military application was to solve the problem of quickly familiarizing soldiers with new territory. The Department of Defense had been deeply impressed by the success of Operation Entebbe in 1976, where the Israeli commandos had quickly built a crude replica of the airport and practiced in it before attacking the real thing. DOD hoped that the Movie Map would show the way to a future where computers could instantly create a three-dimensional simulation of a hostile environment at much lower cost and in less time (see virtual reality).

While the Movie Map has been referred to as an early example of interactive video, it is perhaps more accurate to describe it as a pioneering example of interactive computing. Video, audio, still images and metadata were retrieved from a database and assembled on the fly by the computer (an Interdata minicomputer running the MagicSix operating system) redirecting its actions based upon user input; video was the principal, but not sole affordance of the interaction.

Further reading

  • Video: The Interactive Movie Map: A Surrogate Travel System, January 1981, The Architecture Machine, at the MIT Media Lab Speech Interface Group; YouTube copy.
  • Bender, Walter, Computer animation via optical video disc, Thesis Arch 1980 M.S.V.S., Massachusetts Institute of Technology.
  • Brand, Stewart, The Media Lab, Inventing the Future at MIT (New York: Penguin Books, 1989), 141.
  • Brown, Eric, Digital data bases on optical videodiscs, Thesis E.E. 1981 B.S., Massachusetts Institute of Technology.
  • Clay, Peter, Surrogate travel via optical videodisc, Thesis Urb.Stud 1978 B.S., Massachusetts Institute of Technology.
  • Heckbert, Paul, "Survey of Texture Mapping," IEEE Computer Graphics and Applications, November 1986, pp. 56–67.
  • Lippman, Andrew, "Movie-Maps: An Application of the Optical Videodisc to Computer Graphics," Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques, 1980, pp. 32–42.
  • Mohl, Robert, Interactive movie map: an investigation of spatial learning in virtual environments, Thesis Arch 1982, Massachusetts Institute of Technology.
  • Naimark, Michael, "Aspen the Verb: Musings on Heritage and Virtuality," Presence: Teleoperators and Virtual Environments, Special Issue on Virtual Heritage, MIT Press Journals, Vol. 15, No. 3, June 2006.
  • Yelick, Steven, Anamorphic image processing, Thesis E.E. 1980 B.S., Massachusetts Institute of Technology.

