SmartGeometry 2011


Now in its eighth year, SmartGeometry 2011 took place over six days this spring, hosted in Copenhagen by the Royal Academy of Fine Arts School of Architecture’s Centre for IT and Architecture (CITA).

From its beginnings as a test bed for Bentley Systems’ GenerativeComponents (GC), the event has grown in stature and complexity, adding an increasing amount of physical experimentation to the screen-based activities one would expect at a computing conference.

The workshop on day four: Feel the Force’s sensory floor is on the left, with centre stage taken by Agent Construction’s sculptural construction. Image courtesy of Bentley Systems.

In contrast to the almost heavy-industrial atmosphere of last year’s event, electronics were at the forefront in Copenhagen. The heart of the event is the workshop, and CITA provided a magnificent former warehouse that accommodated all of the workshop clusters in one space. Alongside the laptops and other familiar hardware that filled this space, mysterious perspex boxes full of brightly coloured wiring, crocodile clips and electronic components were visible throughout the workshop. One SmartGeometry veteran remarked, “SG 2011 was a shock to the system, for everybody who was there, but we have come to expect that, although we still marvel at it.”

‘Smart’ was more to the fore than ‘Geometry’, but that is to be expected now that being clever with geometry has become mainstream. Mere parametric manipulation of geometry is just part of the base skill set for this group.

The Kinect sensor, mounted overhead alongside the projector, detects hand gestures, allowing the virtual model to be moved or modified interactively. The way in which software agents within the virtual environment move can also be altered by gesture. Pictured: Stefan Di Leo. Image courtesy of Bentley Systems.

No one could leave the workshop without jumping around on the Jedi-inspired ‘Use the Force’ sensory floor. The sensor data fed the Kangaroo physics plug-in for McNeel’s Grasshopper and GC, generating real-time virtual geometry driven by movement and pressure on the floor. This was projected alongside the installation, showing participants how their movements modified the geometry. While this was great fun, a serious application could be to use similar sensor and software combinations to give real-time loading feedback on real or mock-up structures.
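The basic loop behind the installation, sensor readings exciting a physics relaxation whose geometry is redrawn every frame, can be sketched in a few lines of code. The sketch below is only an illustration of that idea and does not use the Kangaroo API; the grid size, stiffness, damping and synthetic ‘footstep’ pressure field are all invented values.

```python
# Illustration only: pressure readings push a grid of particles, springs pull them
# back towards rest, and the resulting surface is updated every frame. This does not
# use the Kangaroo plug-in's API; all values here are invented.
import math

GRID = 8            # 8 x 8 grid of floor cells / particles
STIFFNESS = 0.4     # spring stiffness pulling each particle back to rest height
DAMPING = 0.85      # velocity damping applied each step

z = [[0.0] * GRID for _ in range(GRID)]   # vertical displacement per cell
v = [[0.0] * GRID for _ in range(GRID)]   # vertical velocity per cell

def read_floor_pressure():
    """Stand-in for the floor sensors: one pressure value per grid cell."""
    # A single synthetic 'footstep' near the centre of the floor.
    return [[math.exp(-((i - 3) ** 2 + (j - 4) ** 2) / 2.0) for j in range(GRID)]
            for i in range(GRID)]

def step():
    """One relaxation step: pressure pushes cells down, springs pull them back."""
    pressure = read_floor_pressure()
    for i in range(GRID):
        for j in range(GRID):
            force = -pressure[i][j] - STIFFNESS * z[i][j]
            v[i][j] = DAMPING * (v[i][j] + force)
            z[i][j] += v[i][j]

# Run a few frames; in the installation this surface would be redrawn by the
# projector every frame rather than printed.
for frame in range(30):
    step()
print("deepest deflection:", -min(min(row) for row in z))
```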


Collaboration

Since joining the directors of SmartGeometry, Xavier De Kestelier of Foster + Partners and Shane Burger of Grimshaw have led a change of direction towards a looser organisation and a reorganised event format, first seen last year in Barcelona, in which working in larger collaborative groups has become central to the workshop phase.

During 2010 an open invitation was issued for workshop topics and leaders. The responses overwhelmingly included “using real-world data”. Over 50 applications were whittled down to 12. A second invitation was then issued for workshop participants, who were required to bring a high level of knowledge and ability to their cluster. This resulted in a carefully selected mix of 120 academics, practitioners and students, largely from an architectural or engineering background but also including representatives of other disciplines such as software, maths and statistics, grouped into 12 ‘clusters’ working over a four-day period.

By day four the activity had moved from data gathering, analysis and processing to the construction of installations that would be exhibited. To achieve this, work had continued well into the small hours every night.

Building the invisible

On the face of it, this year’s ‘Building the invisible’ theme could hardly be more different from SmartGeometry 2010, which was, literally, a more concrete event pushing the boundaries of direct fabrication in AEC. Despite the inclusion of ‘invisible’ in the event title, the workshop participants were all required to produce physical installations for the end-of-workshop exhibition presenting the results of their activities. These displays ranged from a large-scale mock-up of a Gaudi-inspired, acoustically diffusive ceiling, via knitting, to Xbox Kinect-driven interactive environmental simulations.

‘Building the invisible’ focused on gathering various kinds of data and exploring the possibilities offered by making this data integral to the design process. The inputs ranged from evidently practical environmental and structural data to more abstract sources, such as behavioural activities and the electrical sensitivity of building fabric. While there are a number of simulation tools on the market, many tend to be a step or two away from the heart of the design process and can take time to return results. One of the goals of this event was to seek simulation feedback in real time, making it a more interactive part of the design process.

Talkshop

In addition to the exhibition installations, the knowledge gathered by the clusters contributed to the Talkshop day. This was structured around four roundtable sessions, an engaging format where cluster members presented topics to seed each session’s discussion. Common themes emerged as the day proceeded, the discussion being enlivened by questions and comments from the conference Twitter feed.

In session one, Data by Design, Bruno Moser of Foster + Partners talked about data presentation and highlighted how distortions of projection or visualisation, whether intentional or otherwise, affect how data is perceived, citing examples such as the famously Eurocentric Mercator projection.

Moderator Jonathan Rabagliati holds up one of the ASK devices during the Data by Design roundtable discussion. Image courtesy of Bentley Systems.

Ole Sigmund of the Technical University of Denmark reviewed developments in optimisation, a subject that has featured in previous SmartGeometry events and is now an accepted part of design processes in many fields. The example shown, minimising the weight of titanium in an A380 wing rib, is clearly a highly appropriate application, although “optimal is not necessarily beautiful.” Conventionally, ribs are lightened by punching circular holes, but these punched holes can cause stress concentrations; the optimised ribs instead have a seemingly random arrangement of struts and spaces.

At present, optimisation can only assess a limited number of variables. Optimising for stress or other single variables is relatively straightforward. Participants were asked whether optimisation on a wider scale involving many variables should be sought. Some thought that the value of multi-variable optimisation may be questionable; its results would be heavily dependent on the starting assumptions and the relationships between them. The notion that optimisation could replace the judgement of designers provoked lively responses from the floor. One participant was heard to exclaim: “Architecture based on raw data is unlikely to happen!” Others argued that we must question the assumptions made by optimisation tools or other simulations.

In his presentation, Mr Sigmund had anticipated this response, showing a modified optimisation flowchart that included “Architect Satisfied?” as the final gate in the process. It seems the human designer is not being replaced quite yet.
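The point raised from the floor about starting assumptions is easy to demonstrate with a toy example. The sketch below has nothing to do with Prof Sigmund’s topology-optimisation work; every number and the weighted-sum scheme are invented. It optimises a single plate thickness against mass and deflection, and shows how the ‘optimum’ moves as the weighting between the two objectives changes.

```python
# Toy multi-objective example (all values invented): the 'optimal' thickness of a
# plate depends heavily on how mass and deflection are weighted against each other.

def evaluate(thickness_mm):
    """Return (mass, deflection) for a hypothetical plate of the given thickness."""
    mass = 2.7 * thickness_mm                # mass grows linearly with thickness
    deflection = 120.0 / thickness_mm ** 3   # stiffness grows with the cube of thickness
    return mass, deflection

def best_thickness(weight_mass, weight_deflection):
    """Scan candidate thicknesses and return the one minimising a weighted score."""
    candidates = [t / 10 for t in range(10, 101)]   # 1.0 mm .. 10.0 mm
    def score(t):
        mass, deflection = evaluate(t)
        return weight_mass * mass + weight_deflection * deflection
    return min(candidates, key=score)

# The answer moves substantially as the weighting assumptions change:
for w_mass, w_defl in [(1.0, 1.0), (1.0, 10.0), (10.0, 1.0)]:
    print(f"weights {w_mass}/{w_defl}: optimum = {best_thickness(w_mass, w_defl)} mm")
```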

An interesting distinction was made between simply gathering larger quantities of data and improving data-gathering and instrumentation techniques. Improved techniques increase the resolution of our data, a subtle but important difference. When all this numerical data is available, will we be distracted by precision? The answer came: “be as precise as you need to be, but not over precise; sometimes knowing whether the result is 50 or 100 is enough”.

Session two, entitled Form Follows Data, featured simulation methods currently in practice. Foster + Partners’ Giovanni Betti’s main theme was that a pretty image may not communicate any information. Many participants agreed with Mr Betti that “data does not equal information”, and his comment that “data is worth nothing unless it makes its way to your gut feeling” was echoed throughout the event.

Mr Betti posed the question: “How do we visualise?” In contrast to some of the previous colourful but uninformative illustrations, described by one audience member as ‘data porn’, he showed the Beaufort Scale website as an example of data visualisation that communicates.

The Beaufort Scale names and numbers are illustrated with images of actual conditions, making a direct connection with human experience and calibrating gut feeling.

Session three, Performative Data, picked up some of the earlier themes exploring the nature and value of data and provoked the liveliest discussion of the day. It featured Daniel Piker and Robert Cervellione, developers of the Kangaroo plug-in for Bentley’s GC and for McNeel’s Grasshopper, the generative modelling tool for Rhino.

Inspired by the analogue form finding carried out by Gaudi, Frei Otto and others, Kangaroo was written to provide the kind of real-time interaction found in real-world models, while seeking to be playful as well as useful. They highlighted the point that, to be really useful, simulation needs to offer results to designers in real time.

Revisiting the earlier statement that data is not information, Flora Salim of RMIT Melbourne talked of the importance of interpreting data to produce information that can be used by designers. She wondered whether the growth of data might produce career statisticians in the design world. Ms Salim concluded, “expert knowledge and intuition is needed when interpreting data. A statistician would produce stats but would they be useful?”

CITA’s hand-crafted, laser-cut timber shell structure.

Jonathan Rabagliati, chairing the discussion, remarked: “it’s all about optimising gut feeling!”

The final session of the day, entitled The Data Promise, was anticlimactic and failed to put forward a cogent exposition of this ‘promise’. During the discussion Volker Mueller, Research Director for Computational Design at Bentley Systems, asked: “Why do we want data?” It seems that the answer to this question varies according to context. In some cases more and better-interpreted data can clearly be of great value. In others, data and the processes surrounding it, while interesting in themselves, may actually be a distraction.

There was a clear conclusion that all of these things are still only tools and that human designers and their ability to make decisions on ‘gut reaction’ will remain in charge.

Symposium

The Symposium held on the final day contained a series of keynotes and reports from all of the workshop clusters. Our host opened with examples of the work being carried out at CITA. GC has been extensively used in design projects from initial form finding to fabrication.

The experience gathered during these projects suggested that the interfaces to complex parametric tools need options for manual overrides, allowing designs to be tweaked “by eye” in the traditional manner. Never mind the ghost in the machine, let’s have a designer!

Most interesting was CITA’s investigation of the use of craft techniques and traditions supported by computing technology and component fabrication, where the parametric model was not required to communicate construction detail, only to generate a visual model and component sizes, the detail being implicit in the chosen craft technique. That is a lesson for those involved in Building Information Modelling (BIM) processes. The most severe limitation of current research is that it generally addresses only single materials; working with the reality of mixed materials presents a higher level of complexity yet to be fully explored.

In other keynotes, Lisa Amini of IBM’s Smarter City Lab in Dublin highlighted how the Earth’s increasingly urban populations underlie many of the current issues affecting world cities. She showcased IBM projects where sensor-based data gathering is used to inform urban information networks and make a wide range of services smarter. Examples shown included studying energy performance management in over 1,400 New York City buildings and monitoring environmental conditions, pollution levels and local marine life in Galway Bay, Ireland.

Showing a series of innovative projects, Ben van Berkel of UN Studio described the extensive range of computing techniques used in his studio’s design processes as being at the stage of “a handcrafted digital project”, and thinks that “we will have to become programmers in the future.”

Usman Haque, CEO of Connected Environments Ltd, showed a series of inspiring technology-driven public art projects and gave us the background to Pachube.com. Describing experimental projects in which site micro-climate and building environmental monitoring had been carried out by the firm’s trans-disciplinary research group, Billie Faircloth of Kieran Timberlake warned us to look carefully at the results.

In one project a room’s temperature readings were far out of range; investigation revealed a steam main running directly below that the site survey had missed.

Craig Schwitter of Buro Happold reviewed a series of projects in collaboration with Chuck Hoberman featuring sensor-driven movable elements that, unlike most conventional solar shading, respond to solar position and intensity. In the atrium pictured left, the internal facades are shaded but the atrium floor receives dappled sunlight, adding an aesthetic dimension to the shading solution.

Conclusion

The most used word at SG 2011 was ‘data’. It is a word with a very specific meaning but also many casual and misleading usages. In common speech ‘data’ is often interchangeable with ‘information’.

One key lesson to take away from SG 2011 is that data does not equal information. We can gather as much data as we like but that data only becomes information once it has been processed, interpreted and then presented in a form that is comprehensible to the recipient.

This applies equally to a pencil drawing on A4 paper and to a seductive infographic.

The other key lesson was the importance of connections. SG 2011 utilised a rich mix of hardware sensors and scanners, parametric software, plug-ins, custom programming, linking/interpreting applications and output hardware. Without the connections that enable these to communicate with one another, much of the workshop’s activity would have been impossible.

In the less esoteric world of AEC production both of these lessons are vitally important. We should not be distracted or seduced by the existence of massive amounts of data — we actually need the processed comprehensible information that arises from data. We also need to avoid being trapped in data silos and be able to utilise information from many diverse sources.

SmartGeometry is always an exciting event. Don’t be surprised if some of the experiments provoke technological developments that could be seen in the workplace soon. www.smartgeometry.org

This knitted skin has been custom made for the laser cut wooden supporting structure. Electrical conductors are an integral part of the knit. Image courtesy of Bentley Systems.

Clusters

Starting with the 2010 event, the workshop has been organised around clusters of 12 people to focus and apply more energy to each of the themes. At previous events participants worked in smaller teams or even individually, resulting in a more diffuse output.

Performing Skins

The Performing Skins cluster took some steps towards applying parametric techniques to multiple materials. Building on the active research of the cluster leaders, knitted fabric skins were shaped to match underlying timber structures, producing one of the most striking workshop installations.

Having resolved the challenge of interfacing the parametric design, via floppy disk (just where do you buy a floppy disk these days?), with a 1980s CNC knitting machine, and having explored pattern making within the knit, the cluster introduced electrically conductive threads into the fabrics. Responding via electromagnetic induction to human touch or changes in ambient humidity, these could react in various ways, changing the configuration of the fabric.

Knitting many separate circuits into the fabric allows spatial response and control. The once-rare fabric structures that have now become ubiquitous largely follow a narrow range of forms dictated by the cutting and stitching possibilities available to flat woven fabric.

Knitting allows 3D form to be an integral part of the material’s creation. Could a future application for CNC knitting be the production of large scale fitted membranes or composite components containing wiring and smart sensors or active components?

Head of CITA Mette Ramsgaard Thomsen with SmartGeometry co-founders Lars Hesselgren, J Parrish and Hugh Whitehead.

Urban Feeds

The Urban Feeds cluster chose to explore the ability of handheld devices to gather city scale data.

Proprietary devices such as the Libelium Waspmote are commercially available, however they are preconfigured ‘black boxes’ and thus limited. The group built their own devices based on the open-source Arduino platform. Branded the ‘Ambient Sensor Kit’ (ASK), their units were assembled during the first day of the workshop from standard electronic parts and pre-cut clear acrylic enclosure kits; all of the electronics are intended to be tinkered with, giving more control over the data capture. However, the ASK still embodies assumptions that have to be understood and modified through feedback. When trying to understand what ‘data’ means in this context, it is useful to realise that the data gathered by the ASK devices was a long list of sensor resistances logged as voltages. How these voltages are then interpreted is critical to the outcome.
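Turning one of those logged voltages into, say, a temperature requires a sensor model and calibration constants, and each of those is an assumption. The sketch below shows one conventional interpretation step for a thermistor-style sensor; the divider resistor, beta coefficient and supply voltage are typical textbook values, not the cluster’s actual calibration.

```python
import math

# Assumed measurement chain, for illustration only: a 10-bit ADC reading taken across
# a thermistor in a voltage divider. None of these constants come from the ASK devices.
V_SUPPLY = 5.0        # supply voltage (V)
R_FIXED = 10_000.0    # fixed divider resistor (ohms)
R_NOMINAL = 10_000.0  # thermistor resistance at 25 degC (ohms)
BETA = 3950.0         # thermistor beta coefficient
T_NOMINAL_K = 298.15  # 25 degC expressed in kelvin

def adc_to_temperature(adc_count, adc_max=1023):
    """Interpret one logged ADC count as a temperature in degrees Celsius."""
    voltage = V_SUPPLY * adc_count / adc_max
    # Back out the thermistor resistance from the voltage-divider equation.
    resistance = R_FIXED * voltage / (V_SUPPLY - voltage)
    # Beta-model approximation of the resistance/temperature relationship.
    inv_t = 1.0 / T_NOMINAL_K + math.log(resistance / R_NOMINAL) / BETA
    return 1.0 / inv_t - 273.15

# The same raw count yields a different 'temperature' if any assumption above changes.
print(adc_to_temperature(512))   # roughly 25 degC under these assumptions
```

Change the assumed divider resistor or beta value and the same log file tells a different story, which is exactly the point made during the session.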

Participants chose to capture four data streams: temperature, light levels, motion and CO2 levels, with each entry spatially logged by a GPS sensor. Data capture was carried out by the team walking around Copenhagen carrying the ASK units (power consumption was considerable, so a pocket full of 9V batteries was needed). A real-world application could involve many units in fixed locations gathering real-time data over longer periods. Back at the workshop the data was processed into 2D mapping and 3D forms overlaid on Google Earth and Streetmap via Pachube.com, a web-based service used to store, share and discover real-time sensor, energy and environment data from objects, devices and buildings around the world.
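Each reading only becomes mappable once it is bundled with its time and GPS position. The sketch below shows one way a geotagged sample might be structured and logged for later mapping or upload to a service such as Pachube; the field names and CSV format are illustrative choices, not the cluster’s actual schema or the Pachube API.

```python
import csv
import time

# Illustrative record structure for one geotagged ASK sample; the field names are
# assumptions, not the cluster's actual schema.
FIELDS = ["timestamp", "lat", "lon", "temperature_c", "light_level", "motion", "co2_ppm"]

def make_sample(lat, lon, temperature_c, light_level, motion, co2_ppm):
    """Bundle one set of interpreted readings with its time and position."""
    return {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "lat": lat,
        "lon": lon,
        "temperature_c": temperature_c,
        "light_level": light_level,
        "motion": motion,
        "co2_ppm": co2_ppm,
    }

# Append each sample to a log that can later be mapped or pushed to a web feed.
with open("ask_log.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    if f.tell() == 0:          # new file: write the header row first
        writer.writeheader()
    writer.writerow(make_sample(55.6761, 12.5683, 24.1, 540, 0, 460))
```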

A step further in data visualisation was the generation of sculptural forms that dynamically visualised the data streams to direct people to parts of the city with the appropriate conditions for their needs — somewhere quiet and shady for hangover sufferers for instance.

Interacting with the city

In a darkened corner of the workshop this cluster built a scaffold frame to support a series of linked data-projector and Microsoft Kinect sensor combinations. More Arduino boards linked some existing research threads to these sensory display installations.

‘Hacking the Kinect’ has been a recent techie headline, so use of the Kinect is attention-grabbing. In reality the ‘hacking’ fundamentally consisted of writing drivers to connect the Kinect to ordinary PCs instead of the Xbox. Once connected, the point cloud data gathered by the Kinect can be accessed and used. Following the quest for real-time simulation, the Kinect input enabled interaction with applications by hand gesture or, in the case of ‘Ophelia’s Beach’, by placing block models into the display area.
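Once a driver is in place, the depth stream really is just an array that can be queried every frame. The sketch below assumes the open-source libfreenect Python bindings (which return numpy arrays) and a build that exposes depth in millimetres; the crude ‘something is close to the sensor’ test is an invented placeholder, far simpler than the gesture recognition the cluster actually used.

```python
# Minimal sketch of reading Kinect depth data on an ordinary PC, assuming the
# open-source libfreenect Python bindings are installed. The threshold test below
# is an invented stand-in for real gesture recognition.
import freenect

NEAR_MM_THRESHOLD = 900   # anything closer than ~0.9 m counts as a raised hand

def hand_present():
    """Grab one depth frame and check whether anything is close to the sensor."""
    depth, _timestamp = freenect.sync_get_depth(format=freenect.DEPTH_MM)
    # depth is a 480 x 640 numpy array of distances in millimetres; zeros mean 'no reading'
    valid = depth[depth > 0]
    return valid.size > 0 and int(valid.min()) < NEAR_MM_THRESHOLD

if __name__ == "__main__":
    print("hand detected" if hand_present() else "nothing near the sensor")
```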

The possibilities for physically interactive user interfaces demonstrated here should lead to products that we may see very soon. We are all becoming used to gesture-based interfaces on the small scale of our smartphone screens; the extension of this into our CAD systems is surely inevitable.

About the author: Marc Thomas is an independent project technology consultant (www.isisst.co.uk).

