This paper assesses and explores the existing potential for new approaches to software and data, such as the open source movement, to contribute to more open and contestable strategic land-use and transport planning processes. Issues with the current state of ...
Building software which can deliver high performance consistently, across a range of different clusters, is a challenging exercise for developers, as clusters come with specialized architectures and differing queuing policies and costs. Given that the optimal code configuration for a particular model on any machine is difficult for developers and end-users alike to predict, we have developed a test which can provide instructions for optimal code configuration, is instantly comprehensible, and does not bombard the user with technical details. This test is in the form of a 'personality type' resonant with users' everyday experience of colleagues in the workplace. A given cluster is deemed suitable for development and/or production, and for small/composite models and/or large/complex ones. To help users of our software choose an efficient configuration of the code, we convert the personality assessment result into a series of optimization instructions based on their cluster's ...
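To make the idea concrete, here is a minimal Python sketch of how such a 'personality' classification might be converted into configuration hints. All thresholds, traits, and hint text are invented for illustration; they are not the paper's actual test.

```python
# Hypothetical sketch only: classify a cluster's 'personality' from a few
# measurable traits, then map that classification to configuration hints.
# Thresholds and hint text are assumptions, not the paper's method.

def classify_cluster(queue_wait_mins, cores_per_node, interconnect_gbps):
    role = "development" if queue_wait_mins < 10 else "production"
    scale = ("large/complex" if cores_per_node >= 16 and interconnect_gbps >= 10
             else "small/composite")
    return role, scale

def config_hints(role, scale):
    hints = []
    if role == "development":
        hints.append("short queues suit quick, debug-friendly iteration")
    else:
        hints.append("use optimised builds in long-running queues")
    if scale == "large/complex":
        hints.append("decompose the domain across many nodes")
    else:
        hints.append("prefer single-node runs to avoid interconnect costs")
    return hints

role, scale = classify_cluster(queue_wait_mins=5, cores_per_node=32,
                               interconnect_gbps=40)
print(role, scale, config_hints(role, scale))
```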
High-performance computing provides unprecedented capabilities to produce higher resolution 4-D models in a fraction of the time. Thus, the need exists for a new generation of visualization systems able to maintain parity with the enormous volume of data generated. In attempting to write this much data to disk, each computational step introduces a significant performance bottleneck, yet most existing visualization software packages inherently rely on reading data in from a dump file. Available packages make this assumption of post-processing at quite a fundamental level and are not very well suited for plotting very large numbers of specialized particles. This necessitates the creation of a new visualization system that meets the needs of large-scale geodynamic modeling. We have developed such a system, gLucifer, using a software framework approach that allows efficient reuse of our efforts in other areas of research. gLucifer is capable of producing movies of a 4-D data set "on the fly" (simultaneously with running the parallel scientific application) without creating a performance bottleneck. By eliminating most of the human effort involved in visualizing results through postprocessing, gLucifer reconnects the scientist to the numerical experiment as it unfolds. Data sets that were previously very difficult to even manage may be efficiently explored and interrogated without writing to disk, and because this approach is based entirely on memory distributed across as many processors as are being utilized by the scientific application, the visualization solution is scalable into terabytes of data being rendered in real time.
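The core in-situ pattern is simple: rather than dumping each timestep to disk for later post-processing, the solver hands its in-memory field straight to a renderer. A minimal, self-contained Python sketch of that loop (not gLucifer's actual API; the solver and renderer here are stand-ins) follows.

```python
import numpy as np

def advance(field):
    # stand-in for one solver timestep: a simple diffusion smoothing
    return 0.25 * (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                   np.roll(field, 1, 1) + np.roll(field, -1, 1))

def render(field, step):
    # stand-in for the in-situ renderer: consumes the field from memory;
    # no dump file is ever written or re-read
    print(f"step {step:3d}: min={field.min():.4f} max={field.max():.4f}")

field = np.random.rand(64, 64)
for step in range(10):
    field = advance(field)   # compute
    render(field, step)      # visualise 'on the fly'
```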
Why is it that, though the free software movement largely sprang from academic computer labs in the 1980s, some fields of scientific research have been sluggish in developing quality open source community codes? We'll examine the causes of this, and present an overview of the approaches we've employed to deal with them during the development of StGermain-to-Underworld, an open source scientific modelling framework. While some of the issues discussed are specific to scientific computing, we hope many of them are broadly applicable to projects requiring a high degree of domain-specific knowledge and/or complex hardware and algorithms, such as games & mobile computing.
Scientific research applications, or codes, are notoriously difficult to develop, use, and maintain. This is often because scientific software is written from scratch in traditional programming languages such as C and Fortran, by scientists rather than expert programmers. By contrast, modern commercial application software is generally written using toolkits and software frameworks that allow new applications to be rapidly assembled from existing component libraries. In recent years, scientific software frameworks have started to appear, both for grid-enabling existing applications and for developing applications from scratch. This paper compares and contrasts existing scientific frameworks and extrapolates current trends.
HPC scientific computational models are notoriously difficult to develop, debug, and maintain. The reasons for this are multifaceted, including the difficulty of parallel programming, the lack of standard frameworks, and the lack of software engineering skills among scientific software developers.
In this paper we discuss the drivers, design and deployment of StGermain, a software framework that significantly simplifies the development of a spectrum of HPC computational mechanics models. The key distinction between StGermain and conventional approaches to developing computational models is that StGermain decomposes parallel scientific applications into a hierarchical architecture, supporting applications collectively built by a diverse community of scientists, modelers, computational scientists, and software engineers.
Maintaining and adapting scientific application software is an ongoing issue for many researchers and communities, especially in domains such as geophysics, where community codes are constantly evolving to adopt new solution methods and constitutive laws. Traditional high performance computing code is written in C or Fortran, which offer high performance but are notoriously difficult to evolve and maintain. Object-oriented and interpreted programming languages (such as C++, Java, and Python) offer better support for code evolution and maintenance, but have not been widely adopted for scientific programming, for reasons including their performance and/or complexity. This paper describes our approach to developing scientific codes in C that provides the flexibility of interpreted object-oriented environments with the performance of traditional C programming, through techniques including entry points, plug-ins, and coarse-grained objects. This approach has been used to implement two very differently formulated scientific codes in active use and development by the geophysics scientific community.
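The paper's codes implement these techniques in C via function pointers and dynamically loaded modules; the Python sketch below illustrates just the entry-point concept, where a named hook site in the core code can be extended by plug-ins without the core being edited or recompiled. The class and hook names here are illustrative, not the framework's actual API.

```python
class EntryPoint:
    """A named hook site: plug-ins append behaviour; the core just runs it."""
    def __init__(self, name):
        self.name = name
        self.hooks = []

    def append(self, hook):
        self.hooks.append(hook)

    def run(self, *args):
        for hook in self.hooks:
            hook(*args)

# the core code defines the hook site once
update_ep = EntryPoint("TimeStepUpdate")

# a plug-in extends it without modifying the core
def viscosity_update(context):
    context["viscosity"] = 1.0 / (1.0 + context["temperature"])

update_ep.append(viscosity_update)

context = {"temperature": 3.0}
update_ep.run(context)
print(context["viscosity"])  # 0.25
```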
This paper reports on collaborative work carried out at Beyond Zero Emissions to analyse the impact of a metropolitan-wide re-design and upgrade of the greater metropolitan Melbourne region's bus network. The paper details both the key results of the work and the methodology used to arrive at them, which utilised Open Source GIS-T tools. These results include a calculated potential to reduce inter-modal public transport travel times for a selection of circa 38,000 trips between Melbourne's Travel Analysis Zones (TAZs) by 3.9 minutes (7.5%), using a similar number of buses to the existing timetable, given the assumptions in our simulation.

We discuss the key steps in the methodology used and the workflow developed, which relied primarily on the QGIS GIS package, the TransitFeed GTFS libraries, and the OpenTripPlanner journey planning and analysis tool. We outline the key datasets needed for the work, such as OpenStreetMap street databases and GTFS timetables, and discuss the process of generating new GTFS timetables at differing service speeds and vehicle frequencies to create a 'virtual network'.
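As an illustration of this 'virtual network' step, the sketch below rewrites a standard GTFS stop_times.txt so that in-vehicle running times shrink by a chosen factor while each trip's start time is preserved. It operates directly on the raw CSV rather than through the TransitFeed API; the speed-up factor is illustrative, dwell times are ignored, and every row is assumed to carry explicit times.

```python
import csv
from datetime import timedelta

def parse_hms(s):
    h, m, sec = (int(x) for x in s.split(":"))
    return timedelta(hours=h, minutes=m, seconds=sec)  # handles hours >= 24

def fmt_hms(td):
    total = int(td.total_seconds())
    return "%02d:%02d:%02d" % (total // 3600, (total % 3600) // 60, total % 60)

SPEEDUP = 1.25  # simulate 25% faster in-vehicle travel; illustrative only

with open("stop_times.txt", newline="") as f:
    rows = list(csv.DictReader(f))

# keep each trip's departure fixed and compress all later times towards it
rows.sort(key=lambda r: (r["trip_id"], int(r["stop_sequence"])))
trip_start = {}
for r in rows:
    t = parse_hms(r["arrival_time"])  # assumes explicit times on every row
    start = trip_start.setdefault(r["trip_id"], t)
    new_t = fmt_hms(start + (t - start) / SPEEDUP)
    r["arrival_time"] = r["departure_time"] = new_t  # dwell times ignored

with open("stop_times_virtual.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
```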

We comment on several possible improvements to this approach based on the project experience. These include the potential for better retention of routing information from simulations, which would enhance the ability to perform more advanced interpretation and optimisation of network design and capacity. We summarise the case that this workflow, based on open-source tools, emerging open data standards such as GTFS, and the increasing commoditisation of cloud computing services, could be a step towards a new practice of public transport network informatics that can be carried out on a significantly more frequent and dynamic basis.
We discuss and reflect on various aspects of using an online, wiki-based platform, Appropedia, as part of an Action Research project into transport informatics software in civil society organisations. We first present a brief history of and introduction to Appropedia, including a comparison with its much larger 'cousin', Wikipedia, and a description of the way the site was used as part of the transport informatics Action Research project. We then reflect on and discuss several of the key issues and possible lessons arising from this work, both from the perspective of the potential of online public knowledge-commons in Action Research, and also whether Appropedia could in future more explicitly cater for processes of deliberative democracy as well as socio-technical collaboration.
How does society make decisions about highly complex policy domains such as transport, and what role should advanced information technology play in these processes? In particular, how do we try to change policy directions in light of new concerns, contingencies and exigencies, among them that, viewed from a Sustainable Development perspective, approaches to transport in societies such as Australia are highly problematic, especially in terms of impacts on our global climate and demands placed on non-renewable resources?
To develop these concerns into a focused enquiry feasible to complete within a single PhD timeframe, the following paper outlines my research focusing on the role of Civil Society Organisations (CSOs) in the city of Melbourne's metropolitan-scale transport public policy debates, and their efforts in developing and advocating for alternative policy paradigms. In particular, I focus on the role and potential of Open Source Geographic Information Systems for Transport Informatics (GIS-TIs) as a knowledge technology in these organisations' work. I discuss the Action Research methodology and interpretive research paradigm employed; the research is at the early stages of fieldwork with partner organisations.
With the requirements of scientific software being continually extended, the need for fast, scalable numerical solvers is an ongoing concern. Geometric multigrid has been shown to provide theoretically linear scalability for a wide range of mathematical problems, and is gaining popularity as an effective numerical solution technique. However, geometric multigrid is directly influenced by the size and parallel decomposition of a problem's spatial discretisation. The effect of one-dimensional and three-dimensional parallel spatial decompositions on a parallel multigrid solver is investigated in the context of the StGermain scientific software framework. Quantitative results are provided from running a test problem on the APAC national facility.
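The intuition behind comparing the two decompositions can be seen with some back-of-envelope arithmetic (a sketch, not the paper's measurements): for an N-cubed grid on p processes, a 1-D slab decomposition exchanges two full N-by-N faces per interior process regardless of p, whereas a 3-D block decomposition exchanges six faces that shrink as p grows.

```python
def halo_cells_1d(N, p):
    # two full N x N slab faces per interior process, independent of p
    return 2 * N * N

def halo_cells_3d(N, p):
    side = N / p ** (1 / 3)   # edge length of one cubic block
    return 6 * side ** 2      # six block faces per interior process

N, p = 256, 64
print(halo_cells_1d(N, p))          # 131072 halo cells
print(round(halo_cells_3d(N, p)))   # 24576 halo cells
```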
This paper describes the implementation of a parallel divide-and-conquer algorithm for Delaunay triangulation. The algorithm operates by assigning a subset of the data points to each processor; the processors triangulate their local data points, and neighbouring triangulations are then merged. We outline the algorithm we have devised to dynamically identify the affected zone during the merge phase, followed by experimental results which illustrate the degree of scalability achieved. The time complexity of the merge operation is O(n); as a result, the overall complexity of the algorithm is still O(n log n).
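The complexity claim follows from the standard recurrence for the underlying divide-and-conquer structure (a textbook step, restated here for completeness):

```latex
T(n) = 2\,T\!\left(\frac{n}{2}\right) + O(n)
\;\Longrightarrow\;
T(n) = O(n \log n)
```

by the master theorem: the linear-time merge contributes O(n) work at each of the O(log n) levels of recursion.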
The IUPS Physiome Project is an ambitious international effort to provide a computational and data framework for human and other physiology, from the structure and function of organs down to the level of cellular mechanisms. The Physiome Project represents a particularly difficult and complex challenge for visualization, data management and computational models for e-Science. In this paper we introduce a tool we have built for the "Kidneyome" project, which models the kidney physiome, including 3D visualization, model import, and remote data access and computational model runs. We discuss the design, implementation, and use of the tool, and the challenges faced in the project and the larger Physiome project.
There have been many attempts at libraries and frameworks targeted at reducing the difficulty and cost of developing computational codes. To satisfy this goal they typically aim to reduce the amount of code that needs to be written by a single scientist. Is it possible that we have outgrown this approach? The growing trend in successful research codes is the development of group and/or community efforts. In this case, individuals spanning departmental, financial and political borders contribute to the one greater effort, and consequently leverage a greater skills and experience base. Human labour is an expensive resource required for any software development, and so, for any consortium undertaking computational code development, it becomes important to address the way in which the community participates in such an activity. StGermain is a fundamental framework that attempts to address specifically this problem. In this paper we discuss the issues of academic community coding efforts from a code maintainability point of view, and how StGermain provides programming constructs that address these issues.
There are two ways to interpret a title such as "A Plug-in based design for code maintainability in HPC", depending on whether you are a through-and-through HPC traditionalist, or anybody else. But with growing awareness of the cost of software development, the commoditisation of clustering, and the requirements of cross-disciplinary science, scientific code evolution and maintenance is a real issue for researchers. This paper investigates what performance costs one bears for the flexibility and maintainability of HPC software. If we can't be fully flexible, what can we be? With all modern HPC platforms providing expected features such as dynamic libraries, concepts such as plug-ins can be considered. We explain how, where, and why such concepts may be utilised, and then offer indicative examples from two very differently formulated scientific codes.
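On platforms with dynamic libraries the underlying mechanism is dlopen() plus a symbol lookup; the self-contained Python sketch below mirrors that pattern with importlib, fabricating a plug-in module in memory so the example runs standalone. The register() convention is an assumption made for illustration, not the C codes' actual registration symbol.

```python
import importlib
import sys
import types

# fabricate a plug-in module in memory so the sketch runs standalone;
# in the C codes this would be a shared library on disk
plugin = types.ModuleType("demo_plugin")
plugin.register = lambda app: app.setdefault("features", []).append("demo")
sys.modules["demo_plugin"] = plugin

app = {}
for name in ["demo_plugin"]:                 # plug-in names chosen at run time
    module = importlib.import_module(name)   # the dlopen() analogue
    module.register(app)                     # the symbol-lookup analogue
print(app)                                   # {'features': ['demo']}
```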
Each discipline of geophysics has traditionally focused on limited sets of closely related phenomena using methodologies and data sets optimized for its specific area of interest. Why is that? Single discipline, single scale, foundation physics problems are relatively easy to code in Fortran, and hence they eventually become optimized for best performance whilst simultaneously becoming difficult to adapt to new ...
Software is an integral part of many public processes designed to guide the evolution of our cities, such as decisions about transport infrastructure management and investments, and land use zoning. Much of the software used to predict and analyse scenarios is currently proprietary, and this poses several problems regarding transparency, accountability, and the ability of the public to participate in the process. Recently, though, a new generation of open source codes, such as www.UrbanSim.org and www.MATSim.org, has been launched and is in use overseas, and this talk will survey some of the most promising in this genre. Drawing on my PhD research, I'll discuss the current state of these open source modelling codes, the potential for them to be used as part of public processes and the positive changes this could bring about, and some of the software (and process) challenges in using a variety of different open source packages as part of public governance and decision-making. A particular issue addressed will be the different strategies for interoperability between the multiple open-source packages needed to support real-world decision scenarios, such as metadata standards.

The contents of the talk will be based on a PhD programme Patrick is undertaking under the supervision of Prof. Marcus Wigan and others at the Australasian Centre for Governance and Management of Urban Transportation (GAMUT) at the University of Melbourne.
One of the important directions in computational E-Research is the drive towards “Reproducible Research” [1]. Briefly, this involves a move to publishing all the data, tools and steps needed to support scientific results communicated in the literature, so they can be reproduced by other researchers. The key claimed benefits are increased rigour, by allowing computational and domain scientists to test the claims of others in their field, and increased productivity, by avoiding the need to reconstruct the steps and data needed to perform a particular analysis. While open-source software and data repositories go part-way towards providing this capability, as argued in [1] a means of recording and communicating the series of computational and analysis steps that produced a result is also needed. In recent years several software systems have been developed to support this goal of recording and re-running reproducible scientific workflows, such as Kepler and Taverna, both open-source systems [2].

In computational modelling of physical phenomena in fields such as climate science and geophysics, one of the key areas where this concept of reproducible research can be usefully applied is the well-established tradition of development, assessment and communication of 'benchmarks in the literature'. Here 'benchmarking' can mean evaluating any facet of scientific features, numerical accuracy, or computational performance in terms of time, memory or convergence. Benchmarking as an activity generally requires both considerable computational resources and a multi-stage workflow involving assembling and configuring models, running them, then analysing and post-processing their results in various ways. In its latter stages it is amenable to Grid computing, but it requires considerable interactive development of the analysis, which is generally performed on local resources.

Starting from these two concepts, this talk will report and reflect on an effort to apply them in practice for a computational community developing and using the Underworld parallel HPC geophysics application [3,4]. This effort has been embodied in a new Python toolkit called CREDO, which enables the creation of benchmarking scripts, and in enhancements to the Bitten continuous integration toolkit that hosts the group's codebases and testing results. Underworld is co-developed by Monash University, VPAC and several external contributors, with primary funding currently provided by the AuScope NCRIS program. Underworld utilizes the StGermain computational framework [5,6], and the current effort extends and builds upon previous iterations of a suite of unit testing, analysis and visualisation tools developed by the group [7,8].
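A generic sketch of the assemble-run-analyse shape such a benchmark script takes is given below. All names, file paths, and flags are hypothetical stand-ins, not CREDO's actual API.

```python
import subprocess

def run_model(binary, input_xml, nproc, resolution):
    # configure + run stage: launch the parallel model under MPI
    # (binary, input file, and resolution flag names are invented)
    cmd = ["mpirun", "-np", str(nproc), binary, input_xml,
           "--resolution=%d" % resolution]
    subprocess.run(cmd, check=True)

def within_tolerance(measured, expected, rel_tol=0.01):
    # analysis stage: accept the run if the observable reproduces the
    # published benchmark value to within 1%
    return abs(measured - expected) <= rel_tol * abs(expected)

# a benchmark is then a sweep over resolutions plus a pass/fail report,
# with post-processing of each run's output feeding within_tolerance()
for res in (16, 32, 64):
    run_model("model_binary", "benchmark_input.xml", nproc=8, resolution=res)
```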
The radial return mapping algorithm is constructed within the computational context of a hybrid Finite Element and Particle-In-Cell (FE/PIC) method, to allow a fluid flow FE/PIC code to be applied to solid mechanics problems with large displacements and large deformations. The FE/PIC ...
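Radial return itself is a standard algorithm: form a trial deviatoric stress and, if it lies outside the von Mises yield surface, scale it back radially onto the surface. A minimal NumPy sketch for perfect plasticity (no hardening, and not the paper's FE/PIC implementation) follows.

```python
import numpy as np

def radial_return(s_trial, sigma_y):
    """Return-map a trial deviatoric stress onto the von Mises yield
    surface (perfect plasticity, no hardening)."""
    norm = np.sqrt(np.tensordot(s_trial, s_trial))  # ||s_trial||
    radius = np.sqrt(2.0 / 3.0) * sigma_y           # yield surface radius
    if norm <= radius:
        return s_trial                              # elastic: trial stress stands
    return (radius / norm) * s_trial                # plastic: scale back radially

# a trace-free trial stress well outside the surface is returned onto it
s = np.diag([2.0, -1.0, -1.0])
print(radial_return(s, sigma_y=1.0))
```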
When developing computational models of phenomena, physicists are concerned with both the general mathematical formulation of the problem and the detailed physical parameters, and ideally can iteratively refine both over time. However, given the difficulty of writing parallel programs for high-performance computer architectures, time constraints often force the use of a pre-existing code and its associated formulation. This initial time saving is often offset by difficulties once the limitations of a given numerical method are reached. In this talk we present StGermain and Snark, parallel solver frameworks with a modular design which allow quickly changing both the mathematical formulation (e.g. incorporating Lagrangian integration points into the Finite Element Method) and the details of the problem being simulated (constitutive relationships, material types, etc.).
Presentation to the 2005 Australian Computational Earth Systems Simulator (ACcESS) National Workshop
In Australia the energy supplied by the stationary energy sector is mainly generated from non-renewable sources such as coal, oil, petroleum fuels, natural gas or LPG (AGO 2006). In order to reduce greenhouse gas (GHG) emissions there needs to be a significant focus on adopting alternative, renewable sources of energy. Targets for renewable energy are being set internationally, nationally and at the state level, which is placing pressure on local governments and their communities to increase the use of energy from renewable sources.

This paper identifies the renewable energy resources that exist within the City of Onkaparinga Local Government Area (LGA) in greater Adelaide, South Australia. It reviews the methods that have been used to assess renewable energy at the local level, and applies the approach taken for the renewable energy assessment of the City of Playford (in South Australia), first developed in the United Kingdom, to identify the Resource Base, the Resource, and the Reserve for each of solar energy, wind energy and biomass energy in the case study area. GIS methods were used to identify solar and wind resources across the case study area. The results of the assessment indicated that there are significant solar, wind and biomass resources in the Onkaparinga LGA, and that current economic conditions would allow a significant solar resource to be substituted for the energy currently being provided from non-renewable sources.
This project is centred on the ideas of Danish architect and urban designer Jan Gehl, and whether his ideas for ‘people-centred’, ‘humanistic’ cities are relevant and possible to implement in Australian cities. Gehl’s ideas are critiqued and related to relevant concepts from urban design and planning, along with some of the barriers that impede their implementation. The issue is then further explored using a qualitative case study approach, looking at the cities of Adelaide and Melbourne, both state capitals that have employed the consultancy Gehl Architects in the last two decades to perform public life studies and make recommendations.

After presenting the case studies, the narrative of the two cities’ urban design policies in relation to Gehl’s ideas is discussed in an effort to draw out broader implications and suggest future avenues for research. This centres on the Melbourne study’s suggestion that Gehl’s ideas can be achieved given an appropriate institutional commitment to urban design at multiple levels of government, together with a long-term strategy and adept incremental tactics to change public space. Reflections on the Adelaide case study are then made, suggesting that the 2002 report by Gehl was not embedded in an existing change process, but nevertheless usefully focused attention and criticism regarding the public realm. The potential of the report going forward is then assessed, particularly in the context of greater co-operation between levels of government, and the metropolitan Transit Oriented Development agenda in Adelaide.