SmartGeometry 2013

SmartGeometry provides an environment in which to experiment with the latest technologies, from computational design to laser scanning. This year delivered its usual eclectic mix of workshops and talks, ranging from the practical to the surreal.

SmartGeometry could never be accused of being ‘just another event’. Forget papers submitted months in advance, peer reviewed and then presented; most of SmartGeometry happens on the fly through live workshops and group discussions. As a result, one never really knows what to expect. It is one of those rare events where practice meets academia, resulting in content that ranges from the practical to the esoteric. This year the roaming event returned to the UK, the place where it all started ten years ago, with The Bartlett School of Architecture at University College London (UCL) playing host to the six-day event.

For 2013 the theme was ‘Constructing for Uncertainty’, which challenged participants to explore issues of ambiguity, material and immaterial, on both macro and micro scales.

Projections of Reality processed laser scan data in real time in order to project analysis data onto a physical urban model

If the environment is changing, how can designs adapt? If digital tools and fabrication are so accurate, how do they work with the tolerances of on-site fabrication? How can sensing and scanning technologies assist design?

The workshops

The workshops are the heart and soul of SmartGeometry, and each responded to the challenge of ‘Constructing for Uncertainty’ in its own way.

Clusters are all about experimentation, but not in the traditional sense where a hypothesis is tested in a controlled environment. Instead, they provide an environment for the exchange of ideas, processes and techniques, designed to act as a catalyst for design resolution.

The ten workshop clusters, each comprising ten hand-picked participants, spent the first four days of the event heads down, burning the midnight oil. Their findings were then presented to the conference at large on the final day. Here are some of the highlights.

(A)Synchronous Streams proved to be one of the more unusual workshops we have seen to date. Using the new breed of low-cost microcontrollers and single-board computers, such as the Arduino and Raspberry Pi, the team developed remote sensing applications to collect and relay data. The units were attached to low-cost helium balloons and tethered to the ground.


The team developed ways to accurately gather location data and collect measurements such as 3G signal strength, light, sound, temperature, humidity, pollution and wind, with a view to building a much clearer picture of the microclimate, which in industry is often limited to the ground plane.
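By way of illustration, the sketch below shows the sort of logging loop such a sensing rig might run, written here in Python. The sensor readings and the uplink are stubbed with dummy values; on real hardware they would come from the attached drivers and 3G modem, and none of this is the cluster’s actual code.

```python
# Illustrative data-logging loop: sample a set of environmental readings, stamp
# them with time and altitude, and relay them for analysis. All sensor reads and
# the uplink are stand-ins.
import json
import random
import time

def read_sensors():
    # Stand-ins for real sensor drivers (3G strength, light, sound, temperature,
    # humidity, wind); each returns a plausible dummy value.
    return {
        "signal_3g_dbm": random.uniform(-110, -60),
        "light_lux": random.uniform(0, 100000),
        "sound_db": random.uniform(30, 90),
        "temp_c": random.uniform(-5, 30),
        "humidity_pct": random.uniform(20, 100),
        "wind_ms": random.uniform(0, 15),
    }

def relay(sample):
    # Stand-in for the uplink (3G modem or radio link): just print the JSON.
    print(json.dumps(sample))

if __name__ == "__main__":
    for _ in range(3):                    # a real logger would loop indefinitely
        sample = read_sensors()
        sample["timestamp"] = time.time()
        sample["altitude_m"] = random.uniform(0, 100)   # from GPS/barometer in practice
        relay(sample)
        time.sleep(1)
```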

Adaptive Structural Skins was classic SmartGeometry: a team of structural engineers exploring how to form the load-bearing enclosure of a new generation of buildings through parametrisation, analysis, optimisation and form finding. Rhino and Grasshopper were used for form finding, while Generative Components helped design and validate the structural integrity using the STAAD plug-in.
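For a flavour of what form finding involves, the toy Python sketch below relaxes a chain of nodes under gravity until it settles into a hanging, catenary-like shape. It stands in for the general idea only; the cluster’s actual work was done in Grasshopper, Generative Components and STAAD.

```python
# Toy form finding by relaxation: fix the two end nodes, then repeatedly pull each
# free node towards the average of its neighbours (a spring) while gravity drags
# it down, until the chain settles into its natural hanging shape.
def relax_chain(n_nodes=11, span=10.0, gravity=-0.05, stiffness=0.5, iterations=2000):
    xs = [span * i / (n_nodes - 1) for i in range(n_nodes)]
    ys = [0.0] * n_nodes
    for _ in range(iterations):
        for i in range(1, n_nodes - 1):          # end supports stay fixed
            ys[i] += stiffness * ((ys[i - 1] + ys[i + 1]) / 2 - ys[i]) + gravity
    return list(zip(xs, ys))

for x, y in relax_chain():
    print(f"{x:6.2f} {y:7.3f}")                  # the sag deepens towards mid-span
```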

Ten years of smart thinking

This year’s event marked the ten-year anniversary of SmartGeometry events, which have been hosted in many Northern Hemisphere cities over the years.

Back in 2001, the founders were frustrated by the lack of resources devoted to solving architectural design problems through computational methods. CAD (Computer Aided Design) had become just a digital documentation tool, one that rarely used the computing power available.

The early architectural systems, such as Sonata, BDS and Rucaps, offered much more than just a drawing board replacement, but failed to survive the move to the desktop PC.

Along with research friends Dr Robert Aish (Bentley Systems), Robert Woodbury (Simon Fraser University), Axel Kilian (Princeton University), Mark Burry (RMIT University) and Chris Williams (Bath University), the team set out to hold regular workshops to introduce the next generation of architects to geometry, computing, programming, parametrics and analysis, and to train them to define smart, rule-driven architectural and structural designs that respond to changing inputs.

These early workshops allowed Dr Robert Aish to drive Bentley Systems’ Generative Components (GC) scripting language, enabling a new breed of digital ‘tool makers’ to explore complex design concepts using the power of computers. The workshops, or clusters, now last for six days and attract around 300 participants from construction research and leading firms.

Looking at the number of firms that now use computational design as part of their processes, one can easily see the impact that this movement has had.

In the beginning, Bentley’s GC and core programming through CAD APIs were the only game in town, but since then there has been a proliferation of scripting tools, with many firms using products such as McNeel’s Rhino with Grasshopper and Autodesk Maya. Computational tools are now available to the masses and complex forms are being added to the lexicon of all our cities.

The origami-like geometry that resulted from the Adaptive Structural Skins cluster, which allows the structure to be configured in multiple states, was also tested through physical prototyping, using polypropylene and bi-stable struts made from sections of aluminium tape measures.

Computer Vision & Freeform Construction was one of the most ambitious of the clusters. The aim was to develop an augmented reality system that would help non-skilled workers, perhaps those in developing countries, to build complex self-supporting thin-walled vaults.

The team used Generative Components, Grasshopper and Kangaroo (a computational physics engine for Rhino) to design the vaults and then, with the help of artisan tiler and bricklayer Carlos Martin, constructed them using foam glass and Catalonian bricks.

The system used a web camera pointed at QR codes to check the position of the build against the virtual model, accurate to within 10mm. Meanwhile, a series of Microsoft Kinect scanners captured the build in time lapse, producing an eerie, ghost-like video. The Kinect data could, in theory, also be used to check the vault as built against the computer-generated form.
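As an illustration of that positional check, the Python sketch below uses OpenCV’s QR code detector to find a marker in a webcam frame and test whether it sits within a 10mm tolerance of its expected location. The millimetre-per-pixel factor and the expected coordinates are invented placeholders; a real setup would need proper camera calibration and is not the cluster’s actual system.

```python
# Illustrative marker check: detect a QR code, estimate its centre in model
# coordinates via an assumed calibration, and compare against the position the
# virtual model expects. Calibration and expected positions are placeholders.
import cv2
import numpy as np

MM_PER_PIXEL = 0.8                                 # assumed calibration factor
TOLERANCE_MM = 10.0
EXPECTED_MM = {"brick-042": (312.0, 655.0)}        # hypothetical targets from the model

def check_frame(frame):
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    if not data or points is None:
        return None
    centre_mm = points.reshape(-1, 2).mean(axis=0) * MM_PER_PIXEL   # marker centre
    expected = np.array(EXPECTED_MM.get(data, centre_mm))
    deviation = float(np.linalg.norm(centre_mm - expected))
    return data, deviation, deviation <= TOLERANCE_MM

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok and (result := check_frame(frame)):
    marker, deviation, within = result
    print(f"{marker}: {deviation:.1f}mm from target ({'OK' if within else 'adjust'})")
cap.release()
```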

With Projections of Reality, the goal was to create a working prototype of a physical urban model augmented with real-time analysis.

One of the key challenges was working out how to adapt unstructured, laser-scanned point cloud data in real time so it could be understood by urban analysis software.

Generative Components was used to process the raw scan data from a Microsoft Kinect so that it could be used with the various analytical applications. Four projectors then projected analytical imagery onto the physical massing models, which automatically updated as the models were moved around.
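To give a sense of the kind of pre-processing involved, the Python sketch below resamples an unstructured point cloud onto a regular voxel grid, keeping one averaged point per occupied cell so that downstream analysis tools receive a lighter, more even dataset. It is a generic technique, not the cluster’s Generative Components pipeline; the cloud and voxel size are illustrative.

```python
# Voxel-grid downsampling: bin raw scan points into cubic cells and keep the
# average point of each occupied cell.
import numpy as np

def voxel_downsample(points, voxel_size=0.02):
    """points: (N, 3) array in metres; returns one averaged point per occupied voxel."""
    idx = np.floor(points / voxel_size).astype(np.int64)
    idx -= idx.min(axis=0)                          # shift indices to be non-negative
    dims = idx.max(axis=0) + 1
    flat = (idx[:, 0] * dims[1] + idx[:, 1]) * dims[2] + idx[:, 2]   # one id per voxel
    _, inverse = np.unique(flat, return_inverse=True)
    counts = np.bincount(inverse).astype(float)
    out = np.empty((counts.size, 3))
    for axis in range(3):
        out[:, axis] = np.bincount(inverse, weights=points[:, axis]) / counts
    return out

cloud = np.random.rand(100_000, 3)                  # synthetic stand-in for a Kinect frame
print(voxel_downsample(cloud).shape)                # far fewer, evenly spread points
```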

PAD (Probabilistic Architectural Design) addressed the challenge of ‘Constructing for Uncertainty’ by coupling probability theory with parametric architectural and urban design.

By analysing urban areas, the cluster first explored the range of probable building heights in relation to the capacity of adjacent road networks. Buildings were then optimised for sun exposure while staying within the constraints of what the road network could support.
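A toy version of that coupling might look like the Python sketch below: candidate building heights are sampled up to a cap derived from a notional road capacity, and each candidate street is scored with a crude solar exposure proxy. Every number in it is invented for illustration; the cluster’s own models and data were far richer.

```python
# Monte Carlo over probable building heights, constrained by road capacity and
# scored by a crude overshadowing proxy. All figures are illustrative.
import math
import random

ROAD_CAPACITY_PEOPLE = 3000                          # what the adjacent network can serve
PEOPLE_PER_FLOOR = 60
BUILDINGS = 5
MAX_FLOORS = ROAD_CAPACITY_PEOPLE // (PEOPLE_PER_FLOOR * BUILDINGS)

def solar_score(floors_list, sun_altitude_deg=30.0, spacing_m=20.0, floor_m=3.0):
    shadow_factor = 1.0 / math.tan(math.radians(sun_altitude_deg))
    score = 0.0
    for i, floors in enumerate(floors_list):
        height_m = floors * floor_m
        shadow_m = 0.0 if i == 0 else floors_list[i - 1] * floor_m * shadow_factor
        overshadow_m = max(0.0, shadow_m - spacing_m)
        score += height_m - overshadow_m             # reward height, penalise shading
    return score

best = None
for _ in range(10_000):                              # sample probable configurations
    heights = [random.randint(2, MAX_FLOORS) for _ in range(BUILDINGS)]
    s = solar_score(heights)
    if best is None or s > best[0]:
        best = (s, heights)
print("best floors per building:", best[1])
```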

Thermal Reticulation was a test lab looking into the thermal efficiency of façade designs, exploring the difference between the simulated thermal performance of digital models and the real-world thermal performance of the physical models built from them.

Façades were designed computationally and simulated using software aimed at both building energy efficiency (Energy+) and heat flow (Ansys, Autodesk Multiphysics). Designs were then CNC machined or 3D printed and tested using a 500W bulb on the outside of a simple box ‘building’ and multiple temperature sensors on the inside.
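The final comparison can be as simple as lining up the two temperature curves. The short Python sketch below shows one minimal way to quantify the gap, with invented readings standing in for the simulation output and the sensor log.

```python
# Compare a simulated temperature curve against measured readings using RMSE.
# The values below are invented for illustration.
import math

simulated_c = [21.0, 23.5, 26.0, 28.2, 29.8, 30.9, 31.5]   # from the thermal model
measured_c  = [21.0, 22.8, 24.9, 27.0, 29.1, 30.6, 31.8]   # from sensors in the test box

def rmse(predicted, observed):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))

print(f"RMSE between simulation and test rig: {rmse(simulated_c, measured_c):.2f} °C")
```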

Talkshop and symposium

The sgTalkshop and symposium invite practitioners to reflect on topics, projects and theories. The four sessions were:

Constructing data: Data and information. How do we understand and interpret data, and how do we act on it?

Constructing for an evolving ecology: Climate change, policies and the city.

From virtual to actual: From screen to site. From human to robot. From multiple potentials to an actualised form.

Technology and the future culture: From open source computing to synthetic biology.

The ‘From virtual to actual’ Talkshop featured Mark Burry (RMIT + Sagrada Familia), Daniel Bosia (AKT), Sara Klomps (Zaha Hadid Architects), Dirk Krolikowski (Rogers Stirk Harbour + Partners) and Bob Sheil (The Bartlett).

Two of the stand-out projects presented came from Mark Burry and Sara Klomps.

Burry, famous for his work on the Sagrada Familia, explained the design of an interior walling system that, through the shape of its surfaces, would offer a beautiful and acoustically dead meeting room within an open-plan office. Using computational methods and CNC manufacturing, his team created a sculpted, cone-based wall that kept sound inside the space enclosed by the complex sloping surfaces.

With the shackles of the London 2012 restrictions now off, firms can finally talk about their Olympic designs. Klomps gave a great insight into how Zaha Hadid Architects approached the design and manufacture of the diving boards in the London Aquatics Centre.

Using Rhino, the firm cleverly sculpted three diving boards of varying heights from the same three sections. These were machined at full scale in foam, from which fibreglass moulds were made to cast the concrete. It was a great story of digital design and fabrication, showing the team’s design flair. My only concern was that, for structures known to be temporary, they looked like they were built to survive a direct nuclear hit.

In the debate that followed the presentations, a fairly black-and-white discussion took place on the role of 2D drawings in the design and construction process.

Some participants said that they had dispensed with 2D altogether, other than for contractual needs, while others, such as Klomps, strongly disagreed. She said 2D still had a valid place, and that she regularly had to stop her team wasting time modelling a problem that could be sketched in 2D much more quickly.

The Symposium features presentations from leading industry practitioners describing computationally driven projects or thought-leading research. This year SG had lined up Tristram Carfrae (Arup), Michelle Addington (Yale University), Sarah Jane Pell (artist and researcher) and Ben Cerveny (data visualisation designer).

A major coup for SmartGeometry was getting Tristram Carfrae, structural engineer at Arup, to present. Carfrae has helped design some truly amazing, award-winning buildings, such as the National Aquatics Centre (Beijing’s Water Cube), and is one of the world’s leading experts in lightweight, long-span structures.

He explored a number of designs where generative methods were used to arrive at non-linear solutions to site restrictions, leading to more interesting buildings that are structurally as good as, if not better than, standard designs.

Conclusion

SmartGeometry has an unrelenting three-day schedule and covers such a broad range of topics and ideas that it can leave one a little bewildered, with no time to truly absorb the complexities or far-reaching impact of a presentation. It does not help that the event rarely runs to time.

Thankfully the Symposium and sgTalkshops will be available to view online in the coming weeks, when they can be digested at a more leisurely pace.

At the start, ten years ago, SG workshops were producing little more than CAD spirals with arrays, or unravelling curved glazing schedules. Today’s teams were designing complex vaulted structures, 3D printing them, assisting an artisan in fabricating the form, and using 3D scanning technology to compare the built result against the original design. SmartGeometry, and the technology we have at our fingertips, has come a long, long way.

smartgeometry.org


Bentley turns to the cloud for simulation

Event sponsor Bentley Systems previewed a new technology that uses Generative Components and cloud-based simulation to explore hundreds of design iterations.

Santanu Das, Bentley’s senior vice-president for design and modelling, explained how studies have shown that most designs undergo only a few iterations. Time spent modelling and remodelling, setting up simulations, and processing bottlenecks are the big barriers to evaluating more alternatives.

Those wanting to do an acoustic analysis, heat analysis or solar insolation analysis, for example, do not want to have to remodel every single time, explained Mr Das. “[The model] should be smart enough to self describe itself and prepare itself for that type of simulation.”

Simulation is traditionally used for design verification, but Bentley sees a big opportunity for it to be deployed throughout the entire design process, starting with early-stage conceptual models. The results are not spot on but, according to Das, you get a broad ‘red, yellow, green’ indication of where you stand on solar exposure, line of sight or shadows.

As the model progresses, it should be self-aware and provide feedback on how it is handling the different analytics it is exposed to, he said.

Simulation should not be a serial process of build > analyse > modify and then analyse again, emphasised Mr Das. “It should be integrated and it should be in real time and with Bentley’s cloud technologies this is all now possible.”

Mr Das was clear that the technology is not just about using the cloud to run more simulations. “You have to make the analytics talk to each other,” he said. “It’s what we call multi-disciplinary optimisations: the output of structural should influence the input of energy. The output of energy should influence the input of acoustics. We can’t work in silos anymore,” he stressed.

The technology is also about the automatic generation of the different analysis models, which Bentley calls scenario management.

“Using Bentley’s scenario management with our Generative Components we can optimise and automatically generate those models for you — by you giving the constraints and what kind of patterns that you want,” said Mr Das. “All of those models are created and uploaded onto our cloud and you can see we’ve designed them in real time and we compare them against many different scenarios.”

Mr Das gave an example where the system generated tens of options but presented the results in a way that made them easy to assess. In theory this could extend to hundreds, if not thousands, of scenarios, helping make a building more efficient in terms of both performance and cost.
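To illustrate the general pattern, rather than Bentley’s actual implementation, the Python sketch below generates a batch of parametric variants, evaluates them in parallel as a stand-in for cloud processing, and bins the results into the red, yellow and green buckets Das described. The analysis function is a dummy scoring rule.

```python
# Generate parametric scenarios, evaluate them in parallel, and summarise the
# results as red/yellow/green. The scoring rule is a made-up placeholder.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def analyse(variant):
    depth, glazing, shading = variant
    energy = depth * 1.4 + glazing * 2.0 - shading * 1.1      # dummy energy metric
    colour = "green" if energy < 30 else "yellow" if energy < 45 else "red"
    return variant, energy, colour

if __name__ == "__main__":
    # A few parameter ranges quickly multiply into hundreds of scenarios.
    variants = list(product(range(10, 21), range(5, 16), range(0, 6)))
    with ProcessPoolExecutor() as pool:                       # stand-in for the cloud
        results = list(pool.map(analyse, variants))
    for colour in ("green", "yellow", "red"):
        count = sum(1 for _, _, c in results if c == colour)
        print(f"{colour:6s}: {count} of {len(results)} scenarios")
```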

To handle all the processing, Bentley has partnered with Microsoft and is using its Azure platform. The technology is not reliant solely on multi-threaded CPUs, though: Bentley showed a slide highlighting GPU compute using Nvidia CUDA.

Bentley’s cloud-based simulation technology will be launched later this year.

bentley.com
