CineGrid International Workshop 2015 Archive and 2016 and 2017 Announcements

CineGrid International Workshop 2017 - TBD 
CineGrid International Workshop 2015 - Class Photo
The CineGrid International Workshop 2015 was held December 8-10, 2015, at the Qualcomm Institute @ Calit2-UCSD. 
Over 100 international participants enjoyed cutting-edge demonstrations and stimulating presentations at the three-day event.

15th Annual Global LambdaGrid Workshop

Taking a Bohemian view on R&E networking
18 October 2015 -- The 15th Annual Global LambdaGrid Workshop was held on 28-30 September 2015 at the Hotel International, Prague, Czech Republic. More than 110 participants from around the world came to discuss the visions for future research and education networking, whilst also honouring the 20th anniversary of the founding of the host organisation CESNET, the Czech National Research and Education Network. The event was sponsored by Ciena, Cisco and Juniper (Gold Sponsors), ComSource and ICS Intercom Systems (Bronze Sponsors), with support also provided by České Radiokomunikace.
After the opening welcome from Jan Gruntorad, Director of CESNET, the keynote was provided by Erik-Jan Bos of NORDUnet, who discussed interconnecting research and education networks to facilitate ubiquitous guaranteed-bandwidth and best-efforts services on a global scale. This included a vision for an intercontinental networking architecture building on existing initiatives such as ANA-200G and AmLight, based around open exchange points. Jiri Navratil then discussed some practical examples of how the GLIF initiative had enabled CESNET to develop very high bandwidth video hardware and utilise that technology in nearly 20 countries around the world. He stressed the importance of experimental networks that could enable applications that could not be supported by existing production networks.
The opening plenary session then heard about other developments in the GLIF community, including from David Wilde (AARnet) who presented on how GLIF was being used in conjunction with SDN and the Openstack Cloud computing platform to support Australian research; and from Jiri Melnikov (CESNET) on the issues of using latency critical applications over long distances. Cees de Laat (University of Amsterdam) then presented the Pacific Research Platform built around interconnected Science DMZs of top research institutes that are designed to create secure enclaves for data-intensive science and high-speed data transport separate from general-purpose network infrastructures.
The following afternoon and morning sessions were devoted to the Technical Working Group. This opened with updates on the new open exchange in Singapore (SLIX), the forthcoming ANA-200G between Europe and North America, the Hawaiki cable between Hawaii, New Zealand and Australia, and the new SDX initiative at Pacific Wave. There were also updates on new deployments in Eastern Europe and the Netherlands.
This was followed by regular meetings of the AutoGOLE, NSI Implementation and Performance Verification Task Forces, which are developing the components that support the dynamic provisioning and monitoring of lightpath networks, setting the scene for a discussion the next day on the requirements for Global Open Lightpath Exchange (GOLE) operators and Software Defined Exchanges. The session continued with a more in-depth exploration of the Global Network Architecture topic that was raised during the keynote, considering the requirement for point-to-point, point-to-multipoint and overlay services that could reach anywhere in the world, before concluding with a presentation on the RINA architecture (Leonardo Bergesio, i2CAT), which aims to unify networking and distributed computing.
At the end of the first day, several demonstrations were organised at the venue. These included 4K video orchestration using SDN-driven switching, real-time high-bandwidth data streaming from a remote-controlled robotic vehicle, reads and writes to storage at 100 Gb/s bit rates using control plane signalling to create paths over multiple backbones, and a showcase of international SDXs using extensions to support programmable services over multiple network domains. The ongoing work of the GLIF Automated GOLE pilot, which can set up inter-domain lightpaths on demand, was also demonstrated, along with the use of InfiniBand over long distances.
Several meetings had been held the day before the workshop, including the Governance Working Group (chaired by Kees Neggers) that approved the budget for 2016, the hosting proposals for GLIF 2016, and the process for selecting a new GLIF Chair. There were also meetings of the OGF NSI Working Group (chaired by Guy Roberts, GÉANT Association), the GLIF Americas Working Group (chaired by Joe Mambretti, StarLight International/National Communications Exchange), and the GLORIAD project (chaired by Greg Cole, GLORIAD) that is a collaboration of several countries and carriers to bring lightpath infrastructure to scientific users.
The closing plenary session saw presentations on the production SDN infrastructure being operated by AmLight (Jeronimo Bezerra, Florida International University); the experiences of the StarLight Software-Defined Networking Exchange (Joe Mambretti, StarLight International/National Communications Exchange); and on the Named Data Networking project (Ramiro Voicu, Caltech) which is running a testbed that changes traditional network paradigms by facilitating the fetching of data identified by a name from the network. This was followed by a presentation on deploying alien wavelengths between the US and Brazil (Chip Cox, AmLight).
This was followed by a lively panel session on ‘Acquiring Subsea Spectrum’ that was moderated by Erik-Jan Bos (NORDUnet) and featured Chip Cox (AmLight), Dale Finkelson (Internet2), Joe Mambretti (StarLight International/National Communications Exchange), David Wilde (AARNet), Rod Wilson (Ciena) and Charles Yun (REANNZ). This discussed the opportunities and challenges of acquiring and operating trans-oceanic lightpaths, an area of networking that has traditionally not been the purview of research and education networks but is increasingly becoming a requirement as collaborative research acquires a global scope.
The workshop concluded with a closing address from GLIF Chair Kees Neggers (SURF) who thanked CESNET for hosting the workshop as well as Jan Gruntorad’s longstanding contributions to research and education networking over many years. He also announced that he would be standing down as the GLIF Chair as he had now retired from SURF and he felt the role needed someone who was actively involved in research and education networking forums. His successor still needed to be chosen by the community, but he hoped this would happen by the end of the year.
A motion was passed thanking Kees for leading GLIF since its inception, but also for his contributions to the wider development of the Internet. GLIF will continue though, with next year's 16th Annual Global LambdaGrid Workshop (GLIF 2016) being held on 29-30 September 2016 in Miami, USA, hosted by Internet2 and Florida International University and co-located with Internet2 Technology Exchange 2016.
The proceedings of the workshop are available at
About GLIF -- The Global Lambda Integrated Facility (GLIF) is an international virtual organisation of NRENs, consortia and institutions that promotes lambda networking. GLIF provides lambdas internationally as an integrated facility to support data-intensive scientific research, and supports middleware development for lambda networking. It brings together some of the world's premier networking engineers to develop an international infrastructure by identifying equipment, connection requirements, and necessary engineering functions and services. More information is available on the GLIF website at

New World Symphony at CineGrid Brasil 2014

CineGrid Brasil 2014 hosted a bleeding-edge demonstration that brought together, in the same place, three musicians from the New World Symphony orchestra and the CineGrid Brasil audience at the University of São Paulo (USP). The New World Symphony performed live the soundtrack for a fifteen-minute excerpt of the 1929 film São Paulo, Sinfonia da Metrópole, by Adalberto Kemeny and Rodolpho Rex Lustig. The performers played live at the New World Symphony in Miami, and their image was transmitted in 4K, along with a high-definition audio feed, live to the Universidade de São Paulo Medical School Auditorium. The network bandwidth this experiment required was around 10 Gbps, and the connectivity between Miami and São Paulo was provided by AMPATH and ANSP. More:

Cinema technologies have scientific applications

October 08, 2014
By Elton Alisson
Agência FAPESP – In late August, a group of Brazilian researchers transmitted a live version of a 15-minute-long 1929 São Paulo silent film, São Paulo, a Metropolitan Symphony, in 4K resolution (with image definition four times better than full HD television) from the main theater of the School of Medicine at the University of São Paulo (FMUSP) in downtown São Paulo to the New World Symphony concert hall in Miami.
Simultaneously, another group of researchers transmitted the same film, projected in the concert hall of the American orchestra, from Miami to São Paulo in real time – but its sound track was being played live by a trio of instrumentalists in surround sound (on 24 audio channels).
Photo: Agência FAPESP

The demonstration, made using a 10,000-km grid of underwater fiber optic cables between São Paulo and Miami and a connection speed of 10 gigabits per second (Gbps), is one of the high-definition image transmission technologies on ultra-fast networks that are the subject of research in the fields of cinema and digital media.
In addition to its use in the entertainment industry, the technology has been applied to various fields of science and in scientific communication and may help solve problems found with the academic and commercial Internet, say researchers who took part in CineGrid Brasil, an international conference held August 28–29, 2014 at the FMUSP theater.
“We believe that high-definition images will replace the 35 mm film used in movies up to now due to its quality and the potential for online real-time transmission,” noted Jane de Almeida, professor and researcher at the Laboratory of Cinematic Arts and Visualization (LabCine) of the Mackenzie Presbyterian University and one of the event coordinators, in comments to Agência FAPESP.
“The purpose of high-definition images is to enable an ‘expanded’ cinema, one that extrapolates the traditional space of conventional movie theaters and allows images in high resolution to be displayed in real time in multipurpose spaces, with applications in fields like telemedicine, astronomy and microscopy,” said the researcher.
Laurin Herr, founder of CineGrid, said in a lecture at the event that the entertainment, art and culture, and science and technology sectors are driving digital media and that they all share the same needs. “The three fields need more speed and easier access to the Internet, in addition to better computers and equipment to store, distribute and visualize ever-larger amounts of data,” he said.
Evolution of the technology
According to the expert, initial attempts at digital cinema were made by the Japan Broadcasting Corporation (NHK) in the early 1980s.
In a 1981 conference on television and cinema engineering held in Los Angeles, researchers from the Japanese public broadcaster demonstrated an HDTV projector that piqued the interest of filmmakers such as Francis Ford Coppola, director of the Godfather trilogy and many other films. The technology, however, took more than 20 years to be developed by Nippon Telegraph and Telephone, a Japanese telecommunications company, which presented the first 4K digital cinema system to the world in 2001.
Since the early 2000s, however, major studios have begun to test a technology used to capture digital images called 2K, with a resolution of 2,048 x 1,080 pixels, slightly superior to HDTV, which provides images at a definition of 1,920 x 1,080 pixels.
Starting in 2006, studios began using 4K technology, which doubles both dimensions of 2K to 4,096 x 2,160 pixels; it has become the second digital image format currently adopted by the industry, along with 2K.
“Today, 4K is not just a concept but a technology already found in movies, television, videogames, science and medicine,” said Herr.
“The best 4K resolution technology enables larger images with more detail and more immersion. At the movies, this allows the public to become more involved in the film. In the sciences, it allows researchers to better visualize a microorganism or a human organ, for example, at higher definition,” he said.
On August 29, 2014, the event’s final day, there was a live transmission of a cataract surgery conducted at the Ophthalmology Department of the Federal University of São Paulo (Unifesp) to the FMUSP theater using two 4K cameras attached to a microscope on the 10 Gbps ANSP (Academic Network at São Paulo, a FAPESP program) network.
In addition to showing the procedure in ultra-high resolution, the transmission technology allows more doctors in training to observe the details of the surgery, said Cícero Inácio da Silva, deputy coordinator of the Open University of Brazil (UAB) at Unifesp.
“Generally, surgery such as this is observed by, at most, one student or medical resident, through what is known as ‘piggybacking’,” said Silva. “The 4K transmission of the procedure on a high-speed network allows an audience full of doctors in training to watch from an auditorium.”
In December, the researchers plan to transmit another ophthalmological surgery from São Paulo to Miami in 4K resolution, but this time in 3D.
Technological challenges
The goal of the experiments, in addition to demonstrating the viability of transmissions of large volumes of high-definition images to the research community, is to test the efficiency of the optical networks.
In countries such as the United States and Germany, these networks already run at 160 Gbps, roughly 80,000 times the average speed of broadband Internet in Brazil, around 2 Mbps. However, it is still difficult to transmit films in real time due to problems such as latency (signal delay).
“Today, there is a 130-millisecond delay in data transmission via the fiber optic network between São Paulo and Miami, and a half-second delay between São Paulo and Japan,” said Luis Fernandez Lopez, general coordinator of ANSP. “In these two cases, there are physical problems that involve the speed of light in fiber optic cables, which is less than that in a vacuum and cannot be increased.”
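The physical limit Lopez describes can be checked with a back-of-envelope calculation. The sketch below uses the article's approximate 10,000-km São Paulo–Miami path; the fiber refractive index of ~1.47 is a typical value for silica fiber and is an assumption, not a figure from the article:

```python
# Estimate the minimum one-way propagation delay over optical fiber.
C_VACUUM = 299_792_458   # speed of light in vacuum, m/s
N_FIBER = 1.47           # typical refractive index of silica fiber (assumed)

def propagation_delay_ms(path_km: float) -> float:
    """One-way propagation delay in milliseconds over a fiber path of given length."""
    v = C_VACUUM / N_FIBER            # signal speed in fiber, ~204,000 km/s
    return path_km * 1000 / v * 1000  # metres / (m/s), converted to ms

delay = propagation_delay_ms(10_000)  # approximate Sao Paulo-Miami cable length
print(f"one-way: {delay:.0f} ms")     # ~49 ms one way, so ~98 ms round trip
```

The quoted 130 ms figure sits above this ~98 ms round-trip floor, which is consistent with real cable routes being longer than the great-circle distance and with added switching delay along the path.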
The more serious problem in transmitting a 4K film over a high-speed network such as the one that connects São Paulo to Miami is that the film needs to be compressed into a data stream of approximately 500 megabits per second – because a single frame can have 8 megapixels (millions of pixels) – and decompressed upon arrival at the location where it is to be shown.
According to Lopez, the process of compression and decompression increases the delay and complicates the transmission problem. “If these 4K films could be transmitted without first having to be compressed, there would be less of a delay problem. So digital media professionals would like to have a 10 gigabit-per-second fiber optic network to transmit films in this image format without any transmission problems.”
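The arithmetic behind the 10 Gbps wish is straightforward. The rough estimate below assumes 24 fps and 30 bits per pixel (10-bit RGB); actual CineGrid systems may use different bit depths and chroma formats:

```python
# Why uncompressed 4K streaming calls for a ~10 Gbps link (rough estimate).
WIDTH, HEIGHT = 4096, 2160   # 4K resolution quoted in the article
FPS = 24                     # standard cinema frame rate
BITS_PER_PIXEL = 30          # assuming 10-bit RGB; real systems vary

bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
gbps = bits_per_frame * FPS / 1e9
print(f"uncompressed 4K: ~{gbps:.1f} Gbps")  # ~6.4 Gbps, which fits a 10 Gbps link
```

Under these assumptions an uncompressed 4K stream needs roughly 6.4 Gbps, more than ten times the ~500 Mbps compressed stream described above, which is why a dedicated 10 Gbps lightpath would let the compression and decompression steps, and their added delay, be skipped entirely.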
According to Lopez, this same demand for faster and more efficient networks is shared by researchers in the fields of astronomy and particle physics.
By studying the problems related to quality control of the digital film transmission signal, solutions can be developed to improve the performance of academic networks.
“Helping to conduct these demonstrations gives us at ANSP a huge advantage because they put us through our paces and prepare us to handle the demands of researchers from the state of São Paulo,” he said.
“When a particle physicist comes to us asking for a 10 gigabit per second link such as for CERN [European Organization for Nuclear Research, in Switzerland, which houses the Large Hadron Collider (LHC)], to perform experiments without any transmission failure or loss of data, we feel confident because we’ve already done it for the CineGrid,” he said.
Research community
This was the second time that the event was held in Brazil. The first was in 2011 in Rio de Janeiro. The event is organized in several countries by the non-profit international association CineGrid.
Established in the United States in 2004, the association was designed to constitute an interdisciplinary community focused on the investigation, development and demonstration of collaborative network tools that enable the production, use, preservation and exchange of ultra-high-quality digital media on high-speed fiber optic networks.
The association was conceived in the early 2000s, when the convergence of digital technology in the film industry began. Today, it boasts 50 members from around the world, including universities, research institutions, film studios, hardware and software developers and academic networks such as the ANSP.

Cinema promotes advances in scientific visualization

October 29, 2014
By Elton Alisson
Agência FAPESP – When American film director George Lucas wrote the screenplay for the first film in the Star Wars series in 1977, he planned to use computer graphics in one of the major scenes, in which the “Rebel Alliance” is presented with a plan of attack on the Death Star space station.
However, at the time, computer graphics were just beginning to be explored by special effects companies such as Industrial Light & Magic, which was founded by Lucas himself in 1975.
Photo-EVLThe technological solution for the scene was found at the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC) in the United States. At the time, researchers at the institution were developing a computer graphics system to teach molecular modeling to chemistry students. Using this system, they were able to create the three-dimensional animations Lucas had envisioned for the film.
“The same system that created scientific visualization images was used to do the special effects for the Star Wars movie,” said Maxine Brown, EVL director, in a lecture given during CineGrid Brasil, the international conference held on August 28-29, 2014, in the theater of the School of Medicine at the University of São Paulo (FMUSP).
“Larry Cuba, the artist hired to do the graphics used in the scene, came to the EVL and used our computer graphics hardware and software to create the presentation sequence for the plan of attack on the Death Star shown in the movie,” she noted.
This scientific visualization technology, developed by the institution for scientific purposes, is one of several such technologies that have ended up inspiring fiction and reaching movie screens.
On the other hand, computer visualization concepts that have been imagined and presented for the first time in movies have also led researchers from the institution to develop solutions for scientific purposes.
“Science influences the movies and vice-versa,” Brown said. “Sometimes, people see technologies that were developed in our laboratory that they thought were only found in science fiction movies. Conversely, a lot of what we see at the movies that is still science fiction inspires our scientists.”
The virtual reality environment “Holodeck,” presented for the first time in the Star Trek: The Next Generation television series that premiered in 1987, led researchers in 1992 to develop the CAVE Automatic Virtual Environment (CAVE), a virtual reality projection system.
The “virtual cave” is a cube-shaped room in which sounds and images, which visitors can view in three dimensions by wearing stereoscopic glasses, are projected onto the room’s three walls and floor.
The user is able to explore the projected scene by moving around in the cube and using three control buttons to manipulate the three-dimensional objects.
“The CAVE was designed to be a useful tool for scientific visualization, and when it was introduced, they started calling it the Holodeck [from holography],” Brown explained. “It had several applications, such as in a project to virtually reconstruct the Harlem neighborhood [in New York City] during the 1920-1930 period.”
New version
In 2012, the EVL researchers released a new version of the digital cave, CAVE2, which is similar to the “war room” of the 1964 Stanley Kubrick movie Dr. Strangelove. This virtual reality environment is nearly 24 feet in diameter and 8 feet tall, and consists of a single curved wall with more than 70 liquid crystal display (LCD) panels (touch screens).
The room offers users a 320° panoramic view of high-resolution images that are projected on the wall of LCD touch screens at 37 megapixels (millions of pixels) in three dimensions or 74 megapixels in two dimensions.
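The quoted megapixel figures follow from the panel count. The sketch below is a rough consistency check, not a specification: the 72-panel count and 1366x768 per-panel resolution are assumptions drawn from typical CAVE2 descriptions, and the halving in 3D reflects passive stereo splitting the rows between the two eyes:

```python
# Rough consistency check of CAVE2's quoted resolution figures.
PANELS = 72                 # assumed panel count (not stated in the article)
PANEL_W, PANEL_H = 1366, 768  # assumed per-panel resolution

total_mpx_2d = PANELS * PANEL_W * PANEL_H / 1e6
total_mpx_3d = total_mpx_2d / 2  # passive stereo: each eye sees half the rows
print(f"2D: ~{total_mpx_2d:.0f} Mpx, 3D: ~{total_mpx_3d:.0f} Mpx")
```

Under these assumptions the wall totals roughly 75 megapixels in 2D and 38 in 3D, close to the 74/37 figures quoted above, and it illustrates why stereo viewing halves the effective resolution.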
The wall of screens can be used both to explore virtual reality simulations and to analyze large volumes of images placed side by side.
The images are viewed as a whole and manipulated using a visual data interactive exploration technology developed at the EVL over the past five years, through which users are able to touch the screen (as done with a smartphone) or move the data using gestures by means of a motion sensor, as in the 2002 Steven Spielberg science fiction movie Minority Report.
In the movie, the character played by US actor Tom Cruise uses special gloves and gestures to manipulate images, audio and other data files projected on a clear screen.
“The wall of screens in CAVE2 also allows the combining of images and data, so for example a group of researchers can project graphs relating to a single problem that they are attempting to solve to allow everyone to analyze them at once,” Brown said.
According to Brown, the hybrid virtual reality environment is being used on the EVL’s Batman Project, the name of which alludes to a scene from the 2008 Christopher Nolan movie The Dark Knight in which the character Lucius Fox, played by Morgan Freeman, monitors crimes committed in the fictitious city of Gotham on a curved wall of monitors.
The project is designed to display crime data for Chicago – high-crime areas, for example – to help police and decision makers develop more effective approaches to fighting crime.
“We use Google Maps to show the city of Chicago on the CAVE2 wall, superimposed with crime data,” Brown said. “This has allowed us to see several parts of the city at the same time and make comparisons regarding high-crime areas.”
Scientific applications
According to Brown, CAVE2 has also been used to view sets of complex scientific data, such as data from the Human Connectome Project.
Launched in 2009 by the National Institutes of Health (NIH), the Human Connectome Project is designed to identify and map the neural pathways that underlie adult human brain function.
Using CAVE2, psychiatry researchers from the UIC who are dedicated to the study of depression have analyzed neural network images produced by magnetic resonance equipment in a virtual reality environment.
“CAVE2 allows researchers and medical professionals to view data at a much more detailed level than ever before,” Brown said.
More recently, a group of researchers from NASA’s Astrobiology Science & Technology for Exploring Planets Program (ASTEP) has begun to use the virtual reality environment to assess the outcome of field tests on an unmanned underwater vehicle designed to explore the ice-covered surface of the moon Europa – one of Jupiter’s four large Galilean moons.
Named Endurance, the robot was designed to navigate under the ice, collecting data and samples of microbial life, and to map the underwater environment for the production of three-dimensional maps.
To prepare for the Endurance mission, which is expected to take place after 2020, the researchers conducted a series of field tests in places such as Lake Bonney in Antarctica, which is permanently covered with ice.
The data collected by the robot in Antarctica were transmitted to the EVL, where they were used to generate three-dimensional images, maps and data representations of the lake.
The laboratory researchers then created a tool for the simultaneous visualization of hundreds of high-resolution georeferenced images of the layer of ice that covers the lake, which they can use to study the distribution of sediments trapped in the ice surface.
“By meeting in the virtual reality room, the engineers who designed the robot and the scientists involved in collecting data for the project are able to understand the problems each group has and to collectively study solutions,” Brown said.

More Articles...

  1. CineGrid Brasil Demonstrates Expanded Cinema
  2. CineGrid Brazil International Workshop 2014
  3. AmLight Consortium Research & Education Network Helps Transmit FIFA World Cup in 8K from Brazil to Japan
  4. CineGrid 2013: Coming to a Dental Office Near You?
  5. Advanced Software and Global Networks Stream 4K 3D Digital Movies from Poland to the U.S.
  6. CineGrid Brazil: First 4K Live streaming using JVC and FOGO Player
  7. CESNET Demonstrates Remote Film Cleaning at CineGrid 2012 Workshop
  8. CineGrid @ Amsterdam 2012
  9. CineGrid@TIFF 2012
  10. UCL (Université catholique de Louvain) joins CineGrid
  11. SIGGRAPH Asia 2012 Submission Deadlines
  12. CineGrid 2012 Workshop Registration Open
  13. International Digital Cinema Microscopy Project Wins CENIC’s 2012 Innovations in Networking Award for Experimental/Developmental Applications
  14. CineGrid Announces: FGCS - Special issue out now
  15. International Conference on Audio Networking
  16. CineGrid @ Amsterdam 2011
  17. CineGrid @ Rio 2011
  18. CineGrid Member Cyberport Announces First Hong Kong Stereoscopic 3D Competition
  19. Net Works: An evening of telematic music
  20. CineGrid Receives 2011 CENIC Innovations in Networking Award
  22. CineGrid Takes Digital Cinema Into Next Dimension
  23. CineGrid @ TIFF 2010 (23rd annual Tokyo International Film Festival)
  24. AMIA and IASA
  25. UCSD Researchers Receive NSF Award to Support Data-Intensive Applications for Advanced Networks
  26. UCSD Researchers Receive NSF Award to Support Collaboration over Advanced Networks, including CineGrid
  27. Cyberport Becomes First CineGrid Member in Greater China Region - Press Release
  28. Cyberport becomes first CineGrid member in Greater China Region
  29. Cyberport and CineGrid Present Hong Kong's First 4K Live Streaming
  30. CineGrid's 4th Annual Workshop
  31. Laurin Herr Delivers Keynote at DELF 210
  32. CANARIE Provides High-Speed Research Network
  33. White House Initiative to Spur Student Innovations in Broadband
  34. CineGrid @ FILE 2009
  35. International Conference on Creating, Connecting and Collaborating through Computing (C5)
  36. World Opera Report - 2009
  37. CENIC AWARDS to Calit2 and USC
  38. Paul Hearty Recognized for Contributions
  39. Networked applications and technology demonstrated at CineGrid 2008 Workshop
  40. UCSD Calit2 "Project Greenlight" wins CENIC award for Experimental/Developmental Applications
  41. USC SCA "Alternate Endings" wins CENIC award for Educational Applications
  42. CCSIP Call for Proposals
  43. CineGrid Hosts Third Annual International Workshop at Calit2
  44. Ryerson and CineGrid Win the 2008 ORION Discovery Award
  45. DMC Institute / Keio University to Provide DCI Specification
  46. FILE Media Arts Festival features UCSD Art Installations and CineGrid 4K Cinema
  47. 2008 CineGrid International Workshop
  48. CineGrid @ Holland Festival 2007
  49. New GLIF Map Now Available
  50. CineGrid Featured in Keynote at 20th Anniversary SURFnet Relatiedagen
  51. CineGrid Presents at the NAB 2008 Digital Cinema Summit
  52. Image Essence will collaborate on advanced codec research with Calit2.
  53. CineGrid Featured in iSGTW
  54. CineGrid Exchange Open
  55. CineGrid Featured in Keynote at CITI Conference at Columbia University
  56. CineGrid Demonstrates International Networked Collaboration for 4K Motion Picture “Dailies”
  57. CineGrid Receives CENIC Innovations in Networking Award
  58. CineGrid Demonstrates International Networked Distribution of 4K Motion Pictures
  59. SIGGRAPH @ UC San Diego Offers Glimpse of Future in Super High Definition Video
  60. CineGrid Demonstrates Real-Time 4K Trans-Atlantic Streaming
  61. First Annual CineGrid International Workshop Held at Calit2
  62. Lucasfilm Hosts Audio Engineering Society for Calit2 CineGrid Special Event
  63. Independent Film Director Teams with Calit2 on ‘CineGrid’ Coast-to-Coast Screening of New High-Definition Movie over OptIPuter Backplane
  64. iGrid 2005 Receives CENIC Networking Innovation Award
  65. World’s First International Real-time Streaming of 4K Digital Cinema over Gigabit IP Optical Fiber Networks

