
Frontiers of Engineering | Abstracts

Adrian Wilson

Study Manager
Foster Wheeler Energy Limited

Frontiers in Upstream Engineering

Touching on such subjects as:

  • Unconventionals - Heavy Oil, Shale/Tight Gas, CBM
  • Deepwater/Subsea - Cardon IV/UZ etc., Floating production systems
  • Arctic developments/Major pipelines
  • Major Sour Gas field developments, FW Patent

André Fujita

University of São Paulo – USP (Brazil)

Computational statistics in biological big data: methods and applications

The understanding of the biological mechanisms underlying human diseases is one of the main challenges in biological sciences. Despite several efforts, the large number of heterogeneous factors that influence the genesis of a disease makes this a very hard task. One of the challenges consists in understanding diseases by developing methods to statistically analyze and computationally manipulate big data. This difficulty stems from the very large data size, heterogeneity, multidimensionality, and intrinsic noise of the data. In this context, I will present computationally intensive statistical methods developed by our group in the last three years and some applications in two different contexts: neuroscience and molecular biology. In neuroscience, the focus will be on the analysis of resting-state fMRI data of ~600 individuals diagnosed with Attention Deficit Hyperactivity Disorder (ADHD) and ~900 subjects diagnosed with Autism Spectrum Disorder (ASD). In the context of molecular biology, I will show partial results obtained from the study of miRNA expression in a cohort of ~2,000 breast cancer subjects.
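
As a purely illustrative aside (the abstract does not specify the methods used), one common building block of resting-state fMRI analysis is the correlation-based functional connectivity network; the sketch below uses synthetic data and hypothetical names to show the idea.

```python
# Purely illustrative: the abstract does not name its methods, so this sketch
# shows one common primitive of resting-state fMRI analysis, a correlation-
# based functional connectivity network. All data below are synthetic.
import numpy as np

def connectivity_network(bold, threshold=0.3):
    """Binary functional connectivity graph from BOLD time series.

    bold: array of shape (n_regions, n_timepoints), one row per brain region.
    """
    corr = np.corrcoef(bold)                 # Pearson correlation between regions
    np.fill_diagonal(corr, 0.0)              # ignore self-connections
    return (np.abs(corr) > threshold).astype(int)

# Toy subject: 10 brain regions, 200 time points of synthetic signal.
rng = np.random.default_rng(0)
subject = rng.standard_normal((10, 200))
adjacency = connectivity_network(subject)
print("edges:", adjacency.sum() // 2)        # symmetric matrix counts each edge twice
```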

Andrew C. Singer

NERC Centre for Ecology & Hydrology, Wallingford, OX10 8BB, UK

New Insights into Microbial Function & Mechanisms for Bioremediation

Bioremediation is widely acknowledged as being among the more cost-effective solutions for cleaning up polluted soil and water. Despite being less expensive than many engineering and landfilling solutions, it has many limitations and barriers to success, with pollutant bioavailability being among the most important. Bioavailability dictates the amount of pollutant that is capable of being degraded by microbes; if the microbe does not come into contact with the pollutant, the pollutant is not bioavailable and therefore cannot be biodegraded. Three factors largely control pollutant bioavailability: 1) the clay and organic matter content of the soil; 2) aging; and 3) pollutant hydrophobicity. The pollutant must not be so concentrated that it is toxic to the pollutant-degrading microorganisms, nor so dilute that microorganisms only infrequently encounter it. Pollutant concentration, however, can often be managed in such a way as to keep it within the 'goldilocks' zone of not too little and not too much. Bioremediation relies on the growth and activity of microbes; hence, nutrient availability can be limiting and may necessitate supplementation. Bioremediation is often achieved through the oxidation of organic chemicals, implicitly requiring the use of oxygen. If oxygen is limiting, anaerobic processes dominate, many of which are not conducive to the rapid removal of many organic pollutants. It might go without saying that any bioremediation process requires the presence of pollutant-degrading microorganisms. In practice, most environments contain pollutant degraders; however, the abundance of these organisms might be too low to achieve significant progress in bioremediation, in which case they will need to be enriched. Beyond mere presence, microbial activity is required. Metabolically active pollutant degraders might be all that is needed for the removal of some pollutants; more recalcitrant pollutants, however, might require relatively uncommon genes, and these genes/microbes will need to be enriched in the zone of remediation to achieve the desired endpoint. Lastly, the genes responsible for catalyzing pollutant removal may require certain chemical or environmental triggers to initiate activity. In summary, pollutant + degrader does not necessarily result in bioremediation.
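
One standard way to formalize this 'goldilocks' dependence of degradation rate on concentration (offered here as background, not as a model the speaker commits to) is Haldane/Andrews substrate-inhibition kinetics:

```latex
% Haldane (Andrews) substrate-inhibition kinetics: specific degradation rate
% \mu as a function of pollutant concentration S (standard background).
\mu(S) = \frac{\mu_{\max}\, S}{K_S + S + S^2 / K_I}
% The rate is substrate-limited at low S, peaks at S^* = \sqrt{K_S K_I}, and is
% inhibited (toxicity) at high S: maximal removal occurs in between.
```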

A real-world example of the widespread use of bioremediation is the activity that follows oil spills in the ocean, such as the Exxon Valdez and Deepwater Horizon spills. Oil spills are often remediated by removing two main limitations: nutrients and bioavailability. Nutrient limitations are easily alleviated through the use of fertilizers, while bioavailability is improved through the use of surfactants, which break the oil slick into 'bite-size' chunks, thereby increasing the surface area of the oil slick and greatly increasing the number of microorganisms that can grow on the oil, hastening the rate of remediation.

There remains a fundamental question in all of bioremediation: why can you simply add nutrients and surfactants to oil slicks and find that most of the pollutants rapidly degrade? Do the pollutants select for pollutant degraders, or have the degraders always been there? If they were always there, how did they get there? Why can pollutant-degrading genes be recovered from unpolluted soil and water? I propose that the answer is that the degraders and their genes were, for the most part, always there. The genes for pollutant degradation are the result of millions of years of co-evolution between plants, insects, protists, bacteria, essentially all of life on Earth. All organisms communicate through the use of chemicals. It is the antagonistic nature of some chemicals, and the differential benefits afforded to some organisms by the use of these chemicals, that set off both an arms race and the impetus for the evolution of novel enzymes for chemical transformation and degradation. Recognition, transformation and detoxification of these semiochemicals are at the core of the evolution of enzymes in our world, and it is this long history of chemical communication that we can thank for the ubiquity of potential pollutant-degrading enzymes.

Several decades of research have examined and found support for the hypothesis that natural chemicals (e.g., terpenoids, flavonoids) can induce the degradation of pollutants (e.g., polychlorinated biphenyls). However, much of this research was conducted without knowledge of the specific pollutant degrader and/or pollutant-degrading gene. Modern molecular microbiology has developed at such a pace in recent years that it allows not only the identification of isolated microbes and their gene activity, but also the rapid survey of an entire landscape to determine which species of microorganisms are present and to estimate their abundance and activity. The use of 'next-generation sequencing' has enabled the generation of sequence data on a scale that seemed unimaginable only a decade ago, with 100 gigabases of sequence data (i.e., one hundred billion base pairs) being generated from a single run on some of the most cutting-edge high-throughput sequencing machines (e.g., Illumina HiSeq X Ten). Alongside the development of these sequencing approaches has come the need for large-dataset handling and mining. Bioinformatics is used to interrogate DNA, RNA and protein data from complex soil and water microbial communities down to isolated single species. The avalanche of data now possible enables the researcher to ask questions that were previously impossible to answer, at a scale that would have been impossible to conceive of or finance before. It is through the use of these novel molecular approaches that the remediation scientist and engineer can better predict, monitor and alter the efficacy of a bioremediation approach. The insights provided by the molecular revolution enable the researcher and practitioner to manipulate microbial communities in much more deterministic ways, making bioremediation less of an art and more of a science.
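
As a minimal, purely illustrative sketch of the kind of primitive such bioinformatics pipelines are built on (not any specific tool named above), the following counts k-mers in FASTQ-formatted reads; the toy reads are invented.

```python
# Illustrative only: real community surveys use dedicated pipelines; this toy
# just counts k-mers in FASTQ reads (4-line records, sequence on line 2).
import io
from collections import Counter

def kmer_counts(handle, k=4):
    """Count k-mers across all reads in a FASTQ stream."""
    counts = Counter()
    for i, line in enumerate(handle):
        if i % 4 == 1:                       # sequence line of each FASTQ record
            seq = line.strip().upper()
            counts.update(seq[j:j + k] for j in range(len(seq) - k + 1))
    return counts

# Two invented reads standing in for a real sequencing run.
toy_fastq = io.StringIO(
    "@read1\nACGTACGTAC\n+\nIIIIIIIIII\n"
    "@read2\nTTACGTACGG\n+\nIIIIIIIIII\n"
)
print(kmer_counts(toy_fastq).most_common(3))
```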

André Teófilo Beck

PhD University of Newcastle, NSW, Australia, 2004.

UNCERTAINTY QUANTIFICATION AND RISK ANALYSIS IN ENGINEERING – WITH APPLICATIONS TO OIL & GAS

Current global climate trends, perceived through higher ocean levels and higher temperatures, are leading to more intense and more frequent extreme weather events. This, combined with the increase in urbanization and in coastal development, has the potential to drastically increase the consequences of extreme weather and other natural hazards. A major hurricane hit the coast of Santa Catarina in 2004, for the first time on record, with winds of up to 180 km/h. Five hurricanes that hit the Gulf of Mexico between 2004 and 2008 were stronger than anything previously recorded in over 50 years of offshore exploration in the Gulf area. As a consequence, design loads for fixed offshore structures increased by nearly 50%. As we approach times of higher uncertainties, regulations on engineering facilities are becoming stricter, with mandatory risk analysis for all offshore and most onshore Oil & Gas facilities. The profession is learning that absolute safety is a utopia; hence, consequences of failure have to be explicitly addressed in engineering design. These recent challenges have also led to the development of new engineering design concepts, such as infrastructure robustness and resilience, and performance-based design. Addressing the challenges above involves uncertainty quantification, reliability and risk analysis. These issues should drive future research efforts in the Oil & Gas and other industries. Addressing these problems will require a change in the deterministic way that the engineering profession is perceived, which should also drive changes in engineering curricula. The talk will also address Oil & Gas problems with an interface to Smart Grids, Big Data and Bioremediation.
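
For readers unfamiliar with the reliability analysis mentioned above, the textbook entry point is a crude Monte Carlo estimate of a failure probability; the sketch below uses a toy limit state with invented distributions and is not the speaker's method.

```python
# Hedged illustration: crude Monte Carlo estimate of P_f = P(g(X) < 0) for a
# toy limit state g = R - S (capacity minus load). Distributions are invented.
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)
n = 1_000_000
resistance = rng.lognormal(mean=np.log(50.0), sigma=0.10, size=n)  # e.g. member capacity
load = rng.gumbel(loc=30.0, scale=4.0, size=n)                     # e.g. extreme storm load
p_f = np.mean(resistance - load < 0.0)       # fraction of sampled failures
beta = -NormalDist().inv_cdf(p_f)            # generalized reliability index
print(f"P_f = {p_f:.2e}, beta = {beta:.2f}")
```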

Bruno Karolski

University of São Paulo - USP (Brazil)

Acid water, an effluent produced by fluid catalytic cracking (FCC) units, cannot be sent to ordinary industrial wastewater treatment because of toxic contaminants such as phenols, H2S, NH3, HCN, mercaptans and hydrocarbons, along with other environmentally harmful substances found in minor proportions. Because of the high concentrations of toxic contaminants in this wastewater, a special, dedicated treatment must be applied to the acid water before its disposal into the environment. This project developed a treatment for acid water based on the biodegradation activity of bacteria previously isolated from contaminated sites. With only two steps, we could reduce the contaminants' concentrations to levels acceptable for disposal and even for water reuse in the production line.

Participant – Bioremediation (BR)

Elen Aquino Perpetuo

Federal University of São Paulo – Unifesp (Brazil)

Bioprospecting of microorganisms from contaminated sites

Our planet hosts many different environments. There are oceans, deserts, rainforests and many other places where different forms of life can be found. Not all organisms can adapt and/or survive in diverse environments; instead, they inhabit specific environments according to their abiotic and biotic characteristics. In fact, microorganisms are key components of biogeochemical cycles and of the maintenance of life through symbiotic relationships. Collectively, microorganisms have a great metabolic diversity, which allows their ubiquity. Because of this ubiquitous nature, the biotechnological potential of microorganisms is huge, with many possible applications. One of these applications is the utilization of microorganisms or their enzymes in bioremediation approaches. Their organic-degrading or biosorbent capabilities have emerged as an alternative for the sustainable treatment of environmental liabilities. In this approach, microorganisms are isolated from samples of contaminated sites and submitted to enrichment in selective media. The isolates considered the most efficient are then identified by MALDI-TOF mass spectrometry and/or partial sequencing of the 16S rRNA gene. Finally, their ability to metabolize different carbon sources, their growth capacity and tolerance, and their biosorbent capability are determined. The knowledge and use of these isolates is promising for bioremediation processes (bioaugmentation) of sites contaminated with petrochemical residues or metal-laden wastewaters.

Speaker – Bioremediation (BR)

Erick de Moraes Franklin

PhD Université de Toulouse III, France, 2008.

SEDIMENT TRANSPORT AND MORPHODYNAMICS

The transport of granular matter by a fluid flow is frequently found in both nature and industry. It is present, for example, in the erosion of river banks, in the displacement of desert dunes and in hydrocarbon pipelines conveying sand. When the shear stresses exerted by a fluid flow on a granular bed remain within certain limits, some grains are entrained without fluidization of the bed. A moving granular layer, known as bed load, develops in which the grains stay in contact with the fixed part of the bed. Under these conditions, an initially flat granular bed may become unstable, generating ripples and dunes. In the case of rivers, these forms create supplementary friction between the bed and the water, affecting the water depth, and are related to flood and navigation problems. In hydrocarbon pipelines, bedforms generate supplementary pressure loss, pressure fluctuations and flow-rate transients. Although of importance for many scientific domains, bed load and bed instabilities are not well understood. On the one hand, the grains are entrained by the fluid flow, forming a moving granular layer. On the other hand, the moving granular layer changes the fluid flow due to momentum transfer. Moreover, local variations of the fluid flow cause local erosion and deposition on the granular bed, giving rise to ripples and dunes. In turn, ripples and dunes perturb the fluid flow, changing erosion and deposition rates. This presentation will be devoted to this intricate problem.
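
The entrainment threshold described above is classically quantified by the Shields number; the relation below is standard background rather than a result of this talk.

```latex
% Shields number: the classical dimensionless criterion for the onset of bed
% load (standard background). \tau: bed shear stress; \rho_s, \rho: grain and
% fluid densities; g: gravity; d: grain diameter.
\theta = \frac{\tau}{(\rho_s - \rho)\, g\, d}
% Grains are entrained as bed load when \theta exceeds a critical value
% \theta_c (roughly 0.03 to 0.06 in fully turbulent flow) while the bed itself
% remains unfluidized.
```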

Giulio Napolitano

A high performance, machine learning and ontology-assisted system for information extraction, analysis and cancer registration from free-text surgical pathology reports

Presenting: Giulio Napolitano, Darragh McConville

Affiliations: Northern Ireland Cancer Registry, Centre for Public Health, Queen's University Belfast; Centre for Statistical Science and Operational Research, School of Mathematics and Physics, Queen's University Belfast; Kainos, Belfast.

Background: The primary repositories of epidemiological data on cancer are the cancer registries: their main objective is to collect information on cancer cases and produce relevant statistics as outputs. Surgical pathology reports are considered the most accurate source of information on a patient's cancer. Their narrative or almost-narrative form, however, makes the information they contain difficult for a machine to read and requires visual inspection in almost all information extraction scenarios. This is resource-intensive for registries: in Northern Ireland, the Cancer Registry receives over 30,000 pathology reports every year.

Objectives: Our objective was not only to improve those scenarios by means of an ontology-based approach, but also to extend the approach further, in an attempt to design a complete document-to-registration system. The information extraction process was also to be automated, where possible, to enable a shift in registry focus from information extraction to information analysis.

Methods: A number of tools and technologies were used: an ontology, that is, a formal specification of the concepts of relevance in the domain and their relations (the 'semantics'), which is both human and machine readable; the GATE framework, an open-source environment and set of tools for the processing of natural language documents; machine learning algorithms, in our case computer programs that learn from examples to classify portions of text; algorithms to process large and distributed datasets in parallel, following the MapReduce framework; and the open-source enterprise search engine Apache Solr. Machine learning techniques were integrated to design a prototype system capable of extracting information from narrative or semi-structured surgical pathology reports, determining whether the stated information is negated or not relevant to the current status of the patient, using the extracted information to infer staging classification, where possible, and finally presenting the extracted and inferred information in a standard format ready for input into cancer registration systems. The prototype was built using the MapReduce processing framework, enabling parallel information extraction with an embedded machine learning classifier. Apache Solr was then used to index and perform parallel search on the extracted and inferred pathology report information.
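
To make the classification step concrete, here is a deliberately tiny, hypothetical sketch of text classification on report snippets (TF-IDF plus logistic regression); it stands in for, and is not, the GATE/MapReduce prototype described above, and the training texts are invented.

```python
# Hypothetical toy classifier for pathology report snippets; the labels and
# training texts are invented and the real system is far richer than this.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "invasive ductal carcinoma identified in the specimen",
    "no evidence of malignancy in the sampled tissue",
    "tumour extends to the resection margin",
    "margins are clear of tumour",
]
train_labels = ["positive", "negated", "positive", "negated"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(train_texts, train_labels)
print(clf.predict(["no malignancy seen"]))   # likely: ['negated']
```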

Results: The system was able to feed information into a rapid, ontology-aware report extraction and search system, which empowered users with the ability to obtain field-specific statistics and distinct report overviews. The individual modules and subsystems have been developed and tested. We are now in the process of formally evaluating the whole system on a set of breast cancer pathology reports produced in Northern Ireland from 2006 to 2013. Quality will initially be assessed in terms of sensitivity/specificity and precision/recall, while performance will be evaluated in terms of human time saved.

Conclusions: The integration of diverse approaches, such as semantic-aware techniques, machine learning and natural language processing, is likely to prove more and more beneficial to the cancer registration of the future. Furthermore, automating the process of information extraction and inference with MapReduce, in a high-performance environment and combined with a distributed Apache Solr report search, should further free registry resources. This integration will provide increasing support for the automation of cancer registries, liberating staff's valuable time for other tasks.

Acknowledgements: The NICR is funded by the Public Health Agency, NI.

Hector Keun

Exploring and exploiting the relationships between metabolism and disease using metabolomics and systems medicine

Dr Hector Keun – Senior Lecturer, Department of Surgery & Cancer, Imperial College London, UK

Modern molecular medicine is generating data at an unprecedented rate, with large initiatives ongoing in the UK, Europe and globally set to provide new breadth and depth to the genetic and epigenetic characterization of complex diseases such as cancer. The study of metabolism is also undergoing a revolution as the field of metabolomics becomes established. This field seeks, in a manner analogous to genomics, to define globally the metabolic composition of a biological specimen or even an entire organism. Metabolomics offers a new way to understand how metabolic dysregulation can cause - or be caused by - disease, potentially leading to faster and more accurate diagnosis or prognosis, as well as new treatments. However, the accurate interpretation of metabolomic data is subject to numerous challenges that result in part from the chemical diversity and dynamics of the metabolome, and also from the complex and poorly characterized interactions between an organism's genome, its metabolome and its environment. Hence metabolomics has from the outset been inherently dependent on the successful application of 'big data' processing and pattern recognition techniques, as well as metabolic reaction network modelling. In this presentation I will give an overview of the data analytic and modelling challenges facing metabolomics and its integration with other '-omics' techniques ('systems medicine'), with particular reference to the goal of improving the cancer patient journey.
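
As a hedged illustration of the pattern recognition step (the talk names no specific method), principal component analysis of spectral features is a common first look at metabolomic data; the sketch below uses synthetic intensities in place of real NMR/MS spectra.

```python
# Illustrative only: PCA of spectral features, a common first step in
# metabolomic pattern recognition. Synthetic data stand in for real spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
spectra = rng.standard_normal((60, 500))     # 60 specimens x 500 spectral bins
spectra[:30, :50] += 1.5                     # imprint a synthetic group difference

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
print(scores.shape)                          # (60, 2): one point per specimen
```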

Lateef Akanji

University of Aberdeen, UK

Shale Fracking and Well Production Enhancement Potentials

The current increase in global energy demand, coupled with the depletion of conventional hydrocarbon reserves, poses a formidable challenge for efficient recovery from unconventional shale reservoirs. However, the recent discovery of new shale plays, and the evolution of the technologies to explore and exploit them, have revolutionised the oil and gas sector. Despite the discovery of these new plays, the industry is still striving to meet the technical and socio-economic challenges involved in the efficient, safe and cost-effective exploitation of these resources.

In this talk, I will highlight issues relating to energy sustainability and the potential of shale plays to meet global energy demand. A case study of a typical shale fracking operation, looking into the geomechanics of fracture propagation and in-situ stress states, will also be discussed. The hydraulically fractured reservoir consisted of a multi-layered formation containing 7 geological pay-zone layers of shale and dirty sandstone, with a total depth of 10,200 ft. The effect of the in-situ stress contrast between the pay zone and the bounding layers on fracture characteristics such as net pressure, height, width, length and volume of gas produced was investigated.
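
For context on the stress contrast investigated here, one textbook first-pass estimate of the minimum horizontal stress in each layer (not necessarily the model used in this study) is the uniaxial-strain relation:

```latex
% Uniaxial-strain (Eaton-type) estimate of minimum horizontal stress, given
% as standard background rather than the model used in this study.
\sigma_h = \frac{\nu}{1 - \nu}\left(\sigma_v - \alpha\, p_p\right) + \alpha\, p_p
% \nu: Poisson's ratio; \sigma_v: overburden stress; p_p: pore pressure;
% \alpha: Biot coefficient. Layers with different \nu develop different
% \sigma_h, and hydraulic fractures tend to be contained by higher-stress layers.
```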

The results of the numerical simulation indicated that the fracture geometry (height, length and width), as well as the fracture net pressure, is a function of the in-situ stress contrast. Furthermore, in a high in-situ stress contrast situation, the toughness of the underlying bounding layer contributed significantly to the fracture pressure. This in turn affected the production potential of each of the layers bounding the targeted reservoirs, with a strong tendency to cut operational cost when single layers are fractured, as opposed to simultaneous production from multi-layered perforations. The implications of the fracturing operation for the environment will also be discussed.

Luis (Nando) Ochoa

Towards Smart Low-Carbon Electricity Networks

The ongoing and future adoption of small-to-medium-scale low-carbon technologies such as wind power, photovoltaic systems, electric vehicles and electric heat pumps poses, and will continue to pose, significant technical and economic challenges, particularly for distribution networks. Medium-voltage (1 kV–100 kV) and low-voltage (<1 kV) distribution networks have been designed with limited or no controllability and hence are largely unmonitored. However, it is likely that they will become one of the first bottlenecks in the decarbonisation of our power systems.

This presentation will first introduce the potential problems or impacts of different penetrations of low-carbon technologies on LV networks. It will then present and discuss the benefits and drawbacks of some of the potential solutions that might allow higher penetrations without the need for traditional reinforcements. Potential control solutions for MV networks and corresponding aspects will also be presented. The presentation will then address the larger picture of the integration of LV-MV control schemes, as well as the many other aspects related to Smart Grids. Finally, a much broader future context of the electricity infrastructure within cities (Smart Cities) and other energy vectors (Smart Energy Systems) will be discussed.
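
A standard back-of-the-envelope relation (general background, not a claim of this talk) explains why such penetrations stress LV feeders first:

```latex
% Approximate voltage rise at a point on a feeder of impedance R + jX when a
% connected generator exports active power P and reactive power Q (a textbook
% approximation, not a result of this talk):
\Delta V \approx \frac{R P + X Q}{V}
% In LV networks R dominates X, so exported photovoltaic power P raises the
% local voltage directly; statutory voltage limits are typically among the
% first constraints reached as penetration grows.
```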

Maíra Martins da Silva

University of São Paulo – USP São Carlos (Brazil)

Poster: Tackling multi-physics engineering problems via optimization approaches

It is undeniable that designing a good product/process has become a real challenge. The traditional requirements for performance, robustness, time to market and competitive price are strained by demands for product personalization and by ecological, safety and legislative aspects. In fact, products are becoming highly complex, relying on active and semi-active components working concurrently with a large range of technologies. Two major issues can be identified when dealing with these requirements: (i) the use and extension of multi-physics simulation to cope with the multi-disciplinary nature of these products, and (ii) the use of optimization to improve product qualities. To address the former issue, a short discussion on modeling mechatronic systems is raised. Nevertheless, the main objective of this presentation is to tackle the latter issue. To cope with it, distinct optimization approaches are exploited. One may realize that this vast research topic can be employed in several fields of engineering.
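
As a minimal, hypothetical illustration of optimization-driven design (the abstract names no specific objective or algorithm), the sketch below minimizes the mass of a made-up two-variable component under an invented stiffness constraint.

```python
# Hypothetical toy: minimize the mass of a two-variable component subject to a
# made-up stiffness requirement. Names, numbers and the model are invented.
from scipy.optimize import minimize

def mass(x):
    return x[0] * x[1]                    # width * thickness (arbitrary units)

def stiffness_margin(x):
    return x[0] * x[1] ** 3 - 2.0         # must remain >= 0 (invented constraint)

result = minimize(
    mass,
    x0=[1.0, 1.5],
    bounds=[(0.1, 5.0), (0.1, 5.0)],
    constraints=[{"type": "ineq", "fun": stiffness_margin}],
)
print("design:", result.x, "mass:", result.fun)
```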

Session – Oil and Gas

Marcel Parolin Jackowski

University of São Paulo – USP (Brazil)

A cloud-based medical image exploration and analysis platform

The development of public healthcare policies that are both effective and affordable requires governments to readily quantify and understand the health statistics of their populations. The analysis of medical data has the potential to portray overall population health as well as to improve healthcare policies. The increasing use of medical images for clinical and biomedical research, however, generates a vast amount of data to be processed and analyzed, such that individual computers or even local clusters cannot cope in a timely manner. Here we will present an experimental cloud-based platform that allows for the smooth execution of computationally expensive operations on large sets of images. It also makes it possible for research institutes, universities and hospitals around the world to use the same image processing and analysis pipelines, thereby openly sharing results and medical knowledge with ease. This will facilitate the dissemination of medical knowledge generated by different investigations while providing a means for comparing and sharing inter-institutional findings. Having the ability to extrapolate and analyze data quickly will allow scientific progress to outpace the progression and mutation of many diseases.
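
The core idea, farming an expensive per-image operation out in parallel, can be sketched in a few lines; the example below is a local, hypothetical stand-in (a process pool and a placeholder function) rather than the cloud platform itself.

```python
# Hypothetical stand-in for the platform's core pattern: run an expensive
# per-image operation in parallel. A local process pool replaces the cloud.
from multiprocessing import Pool

def process_image(path):
    # Placeholder for a computationally expensive step (e.g. segmentation).
    return path, len(path)

if __name__ == "__main__":
    paths = [f"scan_{i:04d}.nii" for i in range(100)]   # hypothetical image files
    with Pool() as pool:
        results = pool.map(process_image, paths)
    print(results[:3])
```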

Márcio Venício Pilar Alcântara

Regulation Specialist, Aneel (Brazilian Electricity Regulatory Agency)
Superintendence of R&D and Energy Efficiency

Regulatory Issues and R&D Project Opportunities for the Implementation of the Brazilian Smart Grid

The agenda is to present the key drivers for the Brazilian smart grid, including the regulatory actions on smart grids that the regulator is taking, the smart grid R&D projects under development, and some challenges and perspectives on this issue.

Dr Michael J. Spence

Concawe, Brussels (on secondment from Shell Projects and Technology, UK)

In-situ Bioremediation: Opportunities, Technical Challenges and Innovations

Bioremediation is the use of an in-situ or introduced microbial consortium to perform chemical transformations that improve the utility of land and/or water resources. Applications of bioremediation are many and varied, ranging from biological transformation of undesirable chemicals, to the immobilisation of metals, to the modification of material properties.

Traditional remedial techniques for sediments and groundwater, such as excavation to landfill or groundwater remediation by pump-and-treat, are resource intensive and may have significant societal impacts. Where correctly applied, in-situ bioremediation is a cost-effective, sustainable alternative. In-situ bioremediation may be passive, in the sense that the process takes place naturally, or enhanced, whereby measures are taken to change or increase the rate of microbial processes (e.g. injection of fluids containing oxidants and/or nutrients).

Subsurface applications of bioremediation face a technological challenge common to all in-situ remedial approaches, namely the need to address uncertainties resulting from subsurface heterogeneity. The primary cause of heterogeneity is lithological variation, leading to variation in porosity, permeability, sediment composition and fluid flow. The effects of heterogeneity on fluid flow are hard to predict and so pilot testing is usually carried out to verify that injected fluids reach the treatment zone. Heterogeneity is also a key consideration in the design of the monitoring systems required to validate the performance of bioremediation.

An additional factor in the deployment of bioremediation is uncertainty around the 'operational envelope' of the microbial population. In recent years the reduced cost of DNA sequencing has made visible the 99% of environmental micro-organisms that cannot be cultured in the laboratory. With accurate benchmarking of microbial communities now possible across different sites, the research challenge is to understand the highly complex interaction between the community and its environment. Understanding how micro-organisms complete chemical reaction pathways by facilitating the transfer of nutrients, electrons and redox-sensitive ions is a critical element. For example, organisms in biofilms may grow conductive micro-filaments to transfer electrons to electron-accepting species across a redox gradient. The complexity of interactions at the microbial community level means that for most practical applications the microbial community is still treated as a 'black box', with tests undertaken at laboratory and field scale to determine the dependence of biodegradation rate on environmental factors such as pH, electron acceptor and substrate concentrations. A wide variety of monitoring technologies are now available to prove in-situ biodegradation potential, including stable isotope probing (SIP), compound-specific stable isotope analysis (CSIA) and DNA/RNA-based methods such as gene-probe assays and metagenomics.
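
For background on how CSIA proves in-situ biodegradation (standard theory, not specific to this talk), the technique rests on the classical Rayleigh relation:

```latex
% The Rayleigh relation underlying CSIA (classical theory, not from the talk):
\ln\!\left(\frac{R_t}{R_0}\right) = (\alpha - 1)\,\ln\!\left(\frac{C_t}{C_0}\right)
% R: heavy/light isotope ratio of the residual pollutant; C: its concentration;
% \alpha: kinetic fractionation factor. Progressive heavy-isotope enrichment as
% C falls indicates biodegradation, since dilution and sorption leave R
% essentially unchanged.
```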

A case study is presented of hydrocarbon bioremediation at an operational refinery by pulsed oxygen injection. The lines of evidence used to demonstrate the biotransformation potential of the subsurface microbial community are described, together with the measures taken to overcome difficulties caused by subsurface heterogeneity. From an engineering standpoint, the actions undertaken to understand and mitigate the risks associated with subsurface oxygen injection are also addressed.

René Peter Schneider

Department of Chemical Engineering
University of São Paulo

Advanced Tools to Assess Bioremediation

The traditional techniques available for assessing the potential for implementing bioremediation solutions at contaminated sites are time-consuming, since they depend on laboratory microcosm incubations and ex-situ chemical analysis of target contaminants, metabolites and other essential nutrients, such as terminal electron acceptors. Because of the difficulty of accurately simulating field conditions in the laboratory, kinetic data from such experiments are rarely relevant in the field. Expediting bioremediation evaluation and optimization requires new approaches that provide field-relevant data in a shorter time. Over the past 10 years, the USEPA has advanced the Triad concept for contaminated site assessment, which is based on three essential elements: systematic project planning, dynamic work strategies and real-time measurement techniques. Several avenues for incorporating novel bioremediation assessment protocols into the Triad approach, under investigation at CEPEMA, will be discussed in the presentation.
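
As a hedged illustration of how kinetic data from such microcosm incubations are commonly reduced (not a description of CEPEMA's protocol), the sketch below fits a first-order decay to invented concentration measurements.

```python
# Invented data: fit a first-order decay C(t) = C0 * exp(-k t) to contaminant
# concentrations from a hypothetical microcosm incubation.
import numpy as np
from scipy.optimize import curve_fit

days = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
conc = np.array([100.0, 72.0, 55.0, 30.0, 9.0])     # mg/L, hypothetical

def decay(t, c0, k):
    return c0 * np.exp(-k * t)

(c0, k), _ = curve_fit(decay, days, conc, p0=(100.0, 0.05))
print(f"k = {k:.3f} 1/day, half-life = {np.log(2) / k:.1f} days")
```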


