976 results for Scientific experiments
Abstract:
As scientific workflows and the data they operate on grow in size and complexity, the task of defining how those workflows should execute (which resources to use, where resources must be ready for processing, etc.) becomes proportionally more difficult. While "workflow compilers", such as Pegasus, reduce this burden, a further problem arises: since specifying details of execution is now automatic, a workflow's results are harder to interpret, as they are partly due to specifics of execution. By automating the steps between the experiment design and its results, we lose the connection between them, hindering interpretation of the results. To reconnect the scientific data with the original experiment, we argue that scientists should have access to the full provenance of their data, including not only parameters, inputs and intermediary data, but also the abstract experiment, refined into a concrete execution by the "workflow compiler". In this paper, we describe preliminary work on adapting Pegasus to capture the process of workflow refinement in the PASOA provenance system.
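To make the idea concrete, the following is a minimal, hypothetical Python sketch of a refinement-provenance record linking each concrete workflow back to the more abstract one it was derived from; the class and field names are illustrative assumptions and do not reflect the actual Pegasus or PASOA APIs.

```python
# Hypothetical sketch of recording a workflow-refinement step as a
# provenance assertion; names are illustrative, not the PASOA API.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class RefinementAssertion:
    """Links one stage of refinement back to the stage it was derived from."""
    actor: str                 # component performing the refinement, e.g. a site selector
    source_workflow: str       # identifier of the more abstract workflow
    derived_workflow: str      # identifier of the more concrete workflow
    parameters: dict = field(default_factory=dict)
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


# A chain of assertions reconnects the final executable workflow
# to the original abstract experiment design.
trace = [
    RefinementAssertion("site-selector", "abstract-wf-42", "partitioned-wf-42",
                        {"site": "cluster-a"}),
    RefinementAssertion("transfer-planner", "partitioned-wf-42", "executable-wf-42",
                        {"staging": "gridftp"}),
]
for step in trace:
    print(f"{step.derived_workflow} <- {step.actor} <- {step.source_workflow}")
```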
Abstract:
Reproducibility of scientific studies and results is a goal that every scientist must pursue when announcing research outcomes. The rise of computational science, as a way of conducting empirical studies using mathematical models and simulations, has opened a new range of challenges in this context. The adoption of workflows as a way of detailing the scientific procedure of these experiments, along with the experimental data conservation initiatives undertaken during the last decades, has partially eased this problem. However, in order to fully address it, the conservation and reproducibility of the computational equipment related to these workflows must also be considered. The wide range of software and hardware resources required to execute a scientific workflow implies that a comprehensive description detailing what those resources are and how they are arranged is necessary. In this thesis we address the issue of reproducibility of execution environments for scientific workflows, by documenting them in a formalized way that can later be used to obtain an equivalent environment.
In order to do so, we propose a set of semantic models for representing and relating the relevant information of those environments, as well as a set of tools that use these models for generating a description of the infrastructure, and an algorithmic process that consumes these descriptions for deriving a new execution environment specification, which can be enacted into a new equivalent environment using virtualization solutions. We apply these three contributions to a set of representative scientific experiments, belonging to different scientific domains, each exposing different software and hardware requirements. The obtained results prove the feasibility of the proposed approach, by successfully reproducing the target experiments under different virtualization environments.
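As an illustration of the approach, here is a minimal Python sketch in which a declarative environment description is consumed to derive an enactable specification (here, Dockerfile-like text); the schema and field names are assumptions for illustration, not the thesis's actual semantic models.

```python
# Minimal sketch, under assumed schema and field names: a declarative
# description of a workflow's execution environment is consumed to derive
# an equivalent, enactable specification for a virtualization solution.
environment = {
    "os": {"distribution": "ubuntu", "version": "20.04"},
    "packages": ["gcc", "make", "python3"],
    "workflow_inputs": ["/data/sample.fastq"],
}


def derive_container_spec(env: dict) -> str:
    """Turn the abstract environment description into a concrete spec."""
    lines = [f"FROM {env['os']['distribution']}:{env['os']['version']}"]
    if env["packages"]:
        lines.append("RUN apt-get update && apt-get install -y "
                     + " ".join(env["packages"]))
    for path in env["workflow_inputs"]:
        lines.append(f"VOLUME {path}")
    return "\n".join(lines)


print(derive_container_spec(environment))
```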
Abstract:
Global scientific experiments with collaborative scenarios involving multinational teams face significant challenges related to data access: data movements to other regions or Clouds are often precluded by constraints on latency costs, data privacy, and data ownership. Furthermore, each site processes local data sets using specialized algorithms, producing intermediate results that are useful as inputs to applications running on remote sites. This paper shows how to model such collaborative scenarios as a scientific workflow implemented with AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic), a decentralized framework offering a feasible solution for running workflow activities on distributed data centers in different regions without the need for large data movements. AWARD workflow activities are independently monitored and can be dynamically reconfigured and steered by different users, namely by hot-swapping algorithms to enhance the computation results or by changing the workflow structure to support feedback dependencies, where an activity receives feedback output from a successor activity. A real implementation of one practical scenario and its execution on multiple data centers of the Amazon Cloud is presented, including experimental results with steering by multiple users.
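The hot-swapping idea can be sketched in a few lines of Python; the Activity class below is a hypothetical stand-in for illustration, not AWARD's real interface.

```python
# Illustrative sketch: a workflow activity holds a reference to its
# processing algorithm, which a steering user can replace between
# iterations without stopping the activity.
from typing import Callable, Iterable


class Activity:
    def __init__(self, algorithm: Callable[[Iterable[float]], float]):
        self.algorithm = algorithm

    def swap_algorithm(self, new_algorithm):
        """Steering operation: replace the algorithm at runtime."""
        self.algorithm = new_algorithm

    def process(self, batch):
        return self.algorithm(batch)


activity = Activity(lambda xs: sum(xs) / len(xs))   # start with a mean
print(activity.process([1.0, 2.0, 3.0]))            # 2.0
activity.swap_algorithm(lambda xs: max(xs))         # hot-swap to a max
print(activity.process([1.0, 2.0, 3.0]))            # 3.0
```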
Abstract:
While workflow technology has gained momentum in the last decade as a means for specifying and enacting computational experiments in modern science, reusing and repurposing existing workflows to build new scientific experiments is still a daunting task. This is partly due to the difficulty that scientists experience when attempting to understand existing workflows, which contain several data preparation and adaptation steps in addition to the scientifically significant analysis steps. One way to tackle the understandability problem is to provide abstractions that give a high-level view of the activities undertaken within workflows. As a first step towards such abstractions, we report in this paper on the results of a manual analysis performed over a set of real-world scientific workflows from the Taverna and Wings systems. Our analysis has resulted in a set of scientific workflow motifs that outline i) the kinds of data-intensive activities that are observed in workflows (data-oriented motifs), and ii) the different manners in which activities are implemented within workflows (workflow-oriented motifs). These motifs can be useful for informing workflow designers about good and bad practices in workflow development, for informing the design of automated tools for the generation of workflow abstractions, and so on.
Abstract:
Scientific workflows have been adopted in the last decade to represent the computational methods used in in silico scientific experiments and their associated research products. Scientific workflows have proven useful for sharing and reproducing scientific experiments, allowing scientists to visualize, debug, and save time when re-executing previous work. However, scientific workflows may be difficult to understand and reuse. The large number of available workflows in repositories, together with their heterogeneity and lack of documentation and usage examples, may become an obstacle for a scientist aiming to reuse the work of other scientists. Furthermore, given that it is often possible to implement a method using different algorithms or techniques, seemingly disparate workflows may be related at a higher level of abstraction, based on their common functionality. In this thesis we address the issue of reusability and abstraction by exploring how workflows relate to one another in a workflow repository, mining abstractions that may be helpful for workflow reuse. In order to do so, we propose a simple model for representing and relating workflows and their executions, we analyze the typical common abstractions that can be found in workflow repositories, we explore the current practices of users regarding workflow reuse, and we describe a method for discovering useful abstractions for workflows based on existing graph mining techniques.
Our results expose the common abstractions and practices of users in terms of workflow reuse, and show how our proposed abstractions have the potential to become useful for users designing new workflows.
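A toy Python sketch of the abstraction-mining idea follows: workflows are represented as labeled edges, and fragments recurring across the repository are surfaced as candidate abstractions. The repository contents are invented, and real frequent-subgraph mining techniques generalize this simple edge count.

```python
# Toy sketch: count how often each (step -> step) fragment recurs across
# a repository of workflows; frequent fragments are candidate abstractions.
from collections import Counter

repository = [
    [("fetch", "normalize"), ("normalize", "align"), ("align", "plot")],
    [("fetch", "normalize"), ("normalize", "cluster")],
    [("fetch", "normalize"), ("normalize", "align"), ("align", "report")],
]

# Use set() so a fragment counts at most once per workflow.
fragment_counts = Counter(edge for workflow in repository for edge in set(workflow))

for fragment, count in fragment_counts.most_common():
    if count > 1:
        print(f"{fragment[0]} -> {fragment[1]}: appears in {count} workflows")
```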
Abstract:
One of the most important advantages of database systems is that the underlying mathematics is rich enough to specify very complex operations with a small number of statements in the database language. This research covers an aspect of biological informatics, the marriage of information technology and biology, involving the study of real-world phenomena using virtual plants derived from L-system simulation. L-systems were introduced by Aristid Lindenmayer as a mathematical model of multicellular organisms. Not much consideration has been given to the problem of persistent storage for these simulations. Current procedures for querying data generated by L-systems for scientific experiments, simulations and measurements are also inadequate. To address these problems, this paper presents a generic data-modeling process (L-DBM) between L-systems and database systems. This paper shows how L-system productions can be generically and automatically represented in database schemas and how a database can be populated from the L-system strings. This paper further describes the idea of pre-computing recursive structures in the data into derived attributes using compiler generation. A method to allow a correspondence between biologists' terms and compiler-generated terms in a biologist computing environment is supplied. Once L-DBM is given a specific L-system's productions and declarations, it can generate the corresponding schema, covering both simple correspondence terminology and complex recursive-structure data attributes and relationships.
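Under simplifying assumptions (a deterministic, context-free L-system; invented table and column names), the core mapping can be sketched in Python with SQLite: productions go into one table and the strings derived from the axiom populate another, so queries replace re-running the simulation.

```python
# Sketch of storing an L-system and its derivations in a database.
# Tables and column names are invented for illustration, not L-DBM's schema.
import sqlite3

productions = {"A": "AB", "B": "A"}   # Lindenmayer's classic algae system
axiom = "A"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE production (predecessor TEXT, successor TEXT)")
conn.execute("CREATE TABLE derivation (step INTEGER, string TEXT)")
conn.executemany("INSERT INTO production VALUES (?, ?)", productions.items())

state = axiom
for step in range(5):
    conn.execute("INSERT INTO derivation VALUES (?, ?)", (step, state))
    # Rewrite every symbol in parallel, the defining step of an L-system.
    state = "".join(productions.get(symbol, symbol) for symbol in state)

for row in conn.execute("SELECT step, string FROM derivation ORDER BY step"):
    print(row)
```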
Abstract:
Cloud computing is increasingly being adopted in different scenarios, such as social networking, business applications, and scientific experiments. Relying on virtualization technology, the construction of these computing environments targets improvements in the infrastructure, such as power efficiency and fulfillment of users' SLA specifications. The methodology usually applied is packing all the virtual machines onto the proper physical servers. However, failure occurrences in these networked computing systems can have a substantial negative impact on system performance, deviating the system from our initial objectives. In this work, we propose adapted algorithms to dynamically map virtual machines to physical hosts in order to improve cloud infrastructure power efficiency with low impact on users' required performance. Our decision-making algorithms leverage proactive fault-tolerance techniques to deal with system failures, allied with virtual machine technology to share node resources in an accurate and controlled manner. The results indicate that our algorithms perform better in terms of power efficiency and SLA fulfillment in the face of cloud infrastructure failures.
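A hedged Python sketch of the kind of mapping heuristic the abstract refers to: first-fit-decreasing packing of virtual machines onto hosts, preferring hosts with a lower predicted failure probability so that fewer machines stay powered on. The data and scoring are illustrative assumptions, not the paper's actual algorithm.

```python
# First-fit-decreasing VM placement that prefers reliable hosts.
vms = [("vm1", 4), ("vm2", 2), ("vm3", 8), ("vm4", 1)]        # (name, cores)
hosts = [
    {"name": "h1", "capacity": 8, "failure_prob": 0.02, "vms": []},
    {"name": "h2", "capacity": 8, "failure_prob": 0.10, "vms": []},
]


def place(vms, hosts):
    # Sort VMs largest-first and try hosts most-reliable-first.
    for vm, demand in sorted(vms, key=lambda v: -v[1]):
        for host in sorted(hosts, key=lambda h: h["failure_prob"]):
            used = sum(d for _, d in host["vms"])
            if used + demand <= host["capacity"]:
                host["vms"].append((vm, demand))
                break
        else:
            raise RuntimeError(f"no capacity for {vm}")
    return hosts


for host in place(vms, hosts):
    print(host["name"], host["vms"])
```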
Abstract:
The laparoscopic Nissen operation is considered the most appropriate antireflux surgery because it best replicates the normal physiology of the gastroesophageal valve in most patients with typical symptoms of gastroesophageal reflux disease (GERD). Its technical criteria include the secure closure of the diaphragmatic crura (cruroplasty) and the creation of a complete (360-degree), short (less than two centimeters), floppy, tension-free fundoplication, a goal for which the proximal ligation of the short gastric vessels is crucial. I performed the laparoscopic Nissen operation on sixty women and forty men with GERD, without operative mortality, at Surgery Department 6 of the Hospital dos Capuchos, CHLC, EPE. The one hundred patients had a mean age of 46 years and complaints, evolving over 1 to 43 years, of pyrosis (90%), regurgitation (80%), heartburn (73%), and epigastric pain (54%). Upper endoscopy revealed Savary-Miller esophagitis of grade 0-I (62%), II (23%), III (8%) or IV (7%), and sliding hernia (71%), paraesophageal hernia (8%) or no hernia (21%). 24-hour pHmetry diagnosed a mixed pattern (38%), an upright pattern (20%), a supine pattern (20%), or was inconclusive (22%); manometry diagnosed a hypotonic LES (35%), normal esophageal peristalsis (88%) or mild hypomotility (5%), and was unavailable (7%). Hiatal hernia, severe esophagitis, ineffective symptomatic control with proton pump inhibitors, and the wish to discontinue therapy were the indications for surgical treatment. Laparoscopically, I performed ligation of the short gastric vessels (70%), cruroplasty and a total fundoplication (silk 2/0), short (average size 1.5-2 cm), floppy, without tension and without intraoperative calibration of the esophagus. Laparoscopic Nissen fundoplication proved safe and effective in treating GERD. Its adequacy was further confirmed, with statistical significance, by the normalization of postoperative 24-hour pHmetry and manometry in a group of fourteen asymptomatic volunteers. At a mean follow-up of 30.7 months, 94% of the individuals remained asymptomatic.
Wondering about the repercussions of this operation on the microcirculation of the gastric fundus, I hypothesized that the ligation of the short gastric vessels in the Nissen operation could induce changes in the arteriolar diameter of the gastric fundus wall. To investigate the influence of short gastric vessel ligation and total fundoplication on the arteriolar caliber of the stomach wall at the cardia, the fundus, and the region of the short gastric vessels, I designed an experimental research project in guinea pigs. The project was developed at the Research Centre of the Department of Anatomy of the FCM-UNL. I obtained authorization from the Scientific and Pedagogical Committee of the FCM-UNL, requested accreditation as a researcher from the General Directorate of Veterinary and, since the project involved animals, submitted it to the Ethics Committee of the FCM-UNL, which approved it unanimously. To limit the number of animals used to the minimum necessary, the required number of guinea pigs was calculated statistically. The animals were divided into an experimental group (EG), in which the Nissen operation was performed, and a control group (CG), in which only gastric traction was performed, and protocols of anesthesia, surgery, and euthanasia were defined and applied according to the 3Rs principles (Replacement, Reduction, Refinement) of Russell and Burch (1959), a widely accepted ethical framework for conducting scientific experiments on animals humanely.
Using histological and angiomorphological techniques, including scanning electron microscopy of vascular corrosion casts, I analyzed and described the normal anatomy, macroscopic arterial vascularization, and microangioarchitecture of the guinea pig stomach wall, and defined the morphological criteria I considered capable of validating this animal model for the proposed study. For academic reasons the project had to be abbreviated, shortening the time available for its conclusion by about two years; given the scarcity of data after 46 animals had already been recruited, a supplementary convenience sample of eight guinea pigs (four per group) was adopted, constrained by the university deadline and by respect for the dignity of the animals. In this subgroup, by means of scanning electron microscopy, I measured the arteriolar calibers of the vascular casts of the cardia, the fundus, and the short gastric vessel region in both the CG and the EG, making 469 measurements in the former and 461 in the latter. The data were sent to the Research Centre of the CHLC, EPE, which conducted the statistical analysis (ANOVA). This analysis revealed that the arterioles of the mucosal and submucosal plexuses of the cardia, the fundus, and the short gastric vessel region showed increased caliber in the EG. The increase was statistically significant, being greater than 50% of the CG caliber. In the short gastric vessel region the difference was smaller, but remained statistically significant. The straight vessels dilated at their base, at their emergence from the serosal plexus, only in the gastric fundus. In the guinea pig, the Nissen operation (total fundoplication with ligation of the short gastric vessels) caused arteriolar vasodilation of the gastric fundus. I consider that this vasodilation constitutes an accommodation to the modification introduced and infer that the same may happen in humans. Given the microvascular analogy between the two species, I therefore accept that vasodilation may also occur in humans following ligation of the short gastric vessels, and that it likewise corresponds to an arteriolar adaptation mechanism aimed, for example, at compensating the loss incurred by the ligation. The experimental association of short gastric vessel ligation with a total fundoplication, which inherently increases pressure on the EGJ, not only caused no microcirculatory deficit of the distal esophagus or proximal stomach, but also triggered a mechanism of fundic vasodilation that reinforces the safety of the Nissen operation for the treatment of GERD.
Abstract:
Promoting environmental and health education is crucial to allow students to make conscious decisions based on scientific criteria. The study is based on the outcomes of an Educational Project implemented with Portuguese students, which consisted of several activities exploring pre-existing Scientific Gardens at the school, aiming to investigate the antibacterial, antitumor, and anti-inflammatory properties of plant extracts, with their subsequent incorporation into soaps and creams. A logo and a webpage were also created. The effectiveness of the project was assessed through a questionnaire (pre- and post-test) and observation of the participants in terms of engagement and interaction with all individuals involved in the project. The project increased knowledge about autochthonous plants and the potential medicinal properties of the corresponding plant extracts, and raised awareness about the correct design of scientific experiments and the importance of using experimental models of disease. The students regarded their experiences as exciting and valuable and believed that the project helped improve their understanding and increase their interest in these subjects and in science in general. This study emphasizes the importance of raising students' awareness of the valorization of autochthonous plants and the exploitation of their medicinal properties.
Abstract:
In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the difference observed, or one more extreme, given that the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
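The definition of the p value can be made operational with a short permutation test in Python, which estimates the probability, under the null hypothesis of no effect, of a difference at least as extreme as the one observed; the data below are invented for illustration.

```python
# Permutation-test illustration of the p value's definition.
import random

random.seed(1)
group_a = [5.1, 4.9, 5.6, 5.8, 5.3]
group_b = [4.6, 4.4, 5.0, 4.7, 4.5]
observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))

pooled = group_a + group_b
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)            # re-label the data under the null
    a, b = pooled[:5], pooled[5:]
    if abs(sum(a) / 5 - sum(b) / 5) >= observed:
        extreme += 1

p_value = extreme / trials            # fraction at least as extreme as observed
print(f"p = {p_value:.4f}")           # compare against a chosen threshold, e.g. 0.05
```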
Abstract:
Cold-storage chambers contribute greatly to the quality and longevity of stored products. In recent years, the study of control strategies has intensified, aiming to decrease temperature variation inside the storage chamber and to reduce electric power consumption. This study developed a system for data acquisition and process control, in the LabVIEW language, to be applied to the cooling system of a 30 m³ refrigerating chamber. The instrumentation and the application developed fostered scientific experiments aimed at studying the dynamic behavior of the refrigeration system and comparing the performance of the control strategies, both with respect to the controlled temperature and to electricity consumption. The system was used to test on-off, PID, and fuzzy control strategies. Regarding power consumption, the fuzzy controller showed the best result, saving 10% compared with the other tested strategies.
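For reference, the discrete PID update law behind one of the tested strategies can be sketched in Python against a toy chamber model; the gains and plant dynamics are invented for illustration, and the study's actual controllers were implemented in LabVIEW on a real 30 m³ chamber.

```python
# Minimal discrete PID loop driving a toy cold-chamber model.
kp, ki, kd = 2.0, 0.1, 0.5       # proportional, integral, derivative gains
setpoint, temperature = -2.0, 10.0
dt = 1.0                          # sampling period in seconds
integral = 0.0
prev_error = setpoint - temperature

for step in range(10):
    error = setpoint - temperature
    integral += error * dt
    derivative = (error - prev_error) / dt
    # Negative PID output means "too warm"; apply it as cooling power.
    cooling_power = max(0.0, -(kp * error + ki * integral + kd * derivative))
    prev_error = error
    # Toy plant: ambient heat leaks in, the compressor removes heat.
    temperature += dt * (0.05 * (25.0 - temperature) - 0.1 * cooling_power)
    print(f"t={step:2d}s  T={temperature:6.2f}  u={cooling_power:5.2f}")
```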
Abstract:
In the Biodiversity World (BDW) project we have created a flexible and extensible Web Services-based Grid environment for biodiversity researchers to solve problems in biodiversity and analyse biodiversity patterns. In this environment, heterogeneous and globally distributed biodiversity-related resources, such as data sets and analytical tools, are made available to be accessed and assembled by users into workflows to perform complex scientific experiments. One such experiment is bioclimatic modelling of the geographical distribution of individual species using climate variables, in order to predict past and future climate-related changes in species distribution. Data sources and analytical tools required for such analysis of species distribution are widely dispersed, available on heterogeneous platforms, present data in different formats, and lack interoperability. The BDW system brings all these disparate units together so that the user can combine tools with little thought as to their availability, data formats and interoperability. The current Web Services-based Grid environment enables execution of the BDW workflow tasks on remote nodes, but with a limited scope. The next step in the evolution of the BDW architecture is to enable workflow tasks to utilise computational resources available within and outside the BDW domain. We describe the present BDW architecture and its transition to a new framework which provides a distributed computational environment for mapping and executing workflows, in addition to bringing together heterogeneous resources and analytical tools.