132 results for MDA (Model Driven Architecture)
Abstract:
The potential impacts of extreme water level events on our coasts are increasing as populations grow and sea levels rise. To better prepare for the future, coastal engineers and managers need accurate estimates of average exceedance probabilities for extreme water levels. In this paper, we estimate present-day probabilities of extreme water levels around the entire coastline of Australia. Tides and storm surges generated by extra-tropical storms were included by creating a 61-year (1949-2009) hindcast of water levels using a high-resolution depth-averaged hydrodynamic model driven with meteorological data from a global reanalysis. Tropical cyclone-induced surges were included through numerical modelling of a database of synthetic tropical cyclones equivalent to 10,000 years of cyclone activity around Australia. Predicted water level data were analysed using extreme value theory to construct return period curves for both the water level hindcast and the synthetic tropical cyclone modelling. These return period curves were then combined by taking the highest water level at each return period.
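A minimal sketch of the final combination step described above, taking the higher of the hindcast and tropical-cyclone water levels at each return period; the return periods and levels below are illustrative values, not results from the paper:

```python
import numpy as np

# Hypothetical return period curves (years) and associated water levels (m above datum),
# one from the tide/surge hindcast and one from the synthetic tropical cyclone modelling.
return_periods = np.array([1, 10, 100, 1000, 10000])           # years
hindcast_levels = np.array([1.10, 1.35, 1.55, 1.70, 1.82])     # m
cyclone_levels = np.array([0.40, 0.90, 1.60, 2.40, 3.10])      # m

# Combine the two curves by taking the highest water level at each return period.
combined_levels = np.maximum(hindcast_levels, cyclone_levels)

for rp, level in zip(return_periods, combined_levels):
    print(f"{rp:>6}-year return level: {level:.2f} m")
```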
Abstract:
In-memory databases have become a mainstay of enterprise computing, offering significant performance and scalability boosts for online analytical and (to a lesser extent) transactional processing, as well as improved prospects for integration across different applications through an efficient shared database layer. Significant research and development has been undertaken over several years concerning the data management considerations of in-memory databases. However, limited insights are available on the impacts on applications and their supportive middleware platforms, and how they need to evolve to fully function through, and leverage, in-memory database capabilities. This paper provides a first, comprehensive exposition of how in-memory databases impact Business Process Management, as a mission-critical and exemplary model-driven integration and orchestration middleware. Through it, we argue that in-memory databases will render some prevalent uses of legacy BPM middleware obsolete, but also open up exciting possibilities for tighter application integration, better process automation performance and some entirely new BPM capabilities such as process-based application customization. To validate the feasibility of in-memory BPM, we develop a surprisingly simple BPM runtime embedded into SAP HANA that provides BPMN-based process automation capabilities.
Abstract:
Moose populations are managed for sustainable yield balanced against costs caused by damage to forestry or agriculture and collisions with vehicles. Optimal harvests can be calculated based on a structured population model driven by data on abundance and the composition of bulls, cows, and calves obtained by aerial-survey monitoring during winter. Quotas are established by the respective government agency and licenses are issued to hunters to harvest an animal of specified age or sex during the following autumn. Because the cost of aerial monitoring is high, we use a Management Strategy Evaluation to evaluate the costs and benefits of periodic aerial surveys in the context of moose management. Our on-the-fly "seat of your pants" alternative to independent monitoring is management based solely on the kill of moose by hunters, which is usually sufficient to alert the manager to declines in moose abundance that warrant adjustments to harvest strategies. Harvests are relatively cheap to monitor; therefore, data can be obtained each year facilitating annual adjustments to quotas. Other sources of "cheap" monitoring data such as records of the number of moose seen by hunters while hunting also might be obtained, and may provide further useful insight into population abundance, structure and health. Because conservation dollars are usually limited, the high cost of aerial surveys is difficult to justify when alternative methods exist. © 2012 Elsevier Inc.
Abstract:
One of the main challenges in data analytics is that discovering structures and patterns in complex datasets is a computer-intensive task. Recent advances in high-performance computing provide part of the solution. Multicore systems are now more affordable and more accessible. In this paper, we investigate how this can be used to develop more advanced methods for data analytics. We focus on two specific areas: model-driven analysis and data mining using optimisation techniques.
Abstract:
Five significant problems hinder advances in understanding of the volcanology of kimberlites: (1) kimberlite geology is very model driven; (2) a highly genetic terminology drives deposit or facies interpretation; (3) the effects of alteration on preserved depositional textures have been grossly underestimated; (4) the level of understanding of the physical process significance of preserved textures is limited; and (5) some inferred processes and deposits are not based on actual, modern volcanological processes. These issues need to be addressed in order to advance understanding of kimberlite volcanological pipe-forming processes and deposits. The traditional, steep-sided southern African pipe model (Class I) consists of a steep tapering pipe with a deep root zone, a middle diatreme zone and an upper crater zone (if preserved). Each zone is thought to be dominated by distinctive facies, respectively: hypabyssal kimberlite (HK, descriptively called here massive coherent porphyritic kimberlite), tuffisitic kimberlite breccia (TKB, descriptively called here massive, poorly sorted lapilli tuff) and crater zone facies, which include variably bedded pyroclastic kimberlite and resedimented and reworked volcaniclastic kimberlite (RVK). Porphyritic coherent kimberlite may, however, also be emplaced at different levels in the pipe, as later-stage intrusions, as well as dykes in the surrounding country rock. The relationship between HK and TKB is not always clear. Subterranean fluidisation as an emplacement process is a largely unsubstantiated hypothesis; modern in-vent volcanological processes should initially be considered to explain observed deposits. Crater zone volcaniclastic deposits can occur within the diatreme zone of some pipes, indicating that the pipe was largely empty at the end of the eruption, and subsequently began to fill in, largely through resedimentation and sourcing of pyroclastic deposits from nearby vents. The Class II and III Canadian kimberlite models have a more factual, descriptive basis, but are still inadequately documented given the recency of their discovery. The diversity amongst kimberlite bodies suggests that a three-model classification is an over-simplification. Every kimberlite is altered to varying degrees, which is an intrinsic consequence of the ultrabasic composition of kimberlite and the in-vent context; few preserve original textures. The effects of syn- to post-emplacement alteration on original textures have not been adequately considered to date, and should be back-stripped to identify original textural elements and configurations. Applying sedimentological textural configurations as a guide to emplacement processes would be useful. The traditional terminology has many connotations about spatial position in the pipe and about process. Perhaps the traditional terminology can be retained in the industrial situation as a general lithofacies-mining terminological scheme because it is so entrenched. However, for research purposes a more descriptive lithofacies terminology should be adopted to facilitate detailed understanding of deposit characteristics, important variations in these, and their process origins. For example, every deposit of TKB is different in componentry, texture, or depositional structure. However, because so many deposits in many different pipes are called TKB, there is an implication that they are all similar and that similar processes were involved, which is far from clear.
Abstract:
Recent evidence indicates that the estrogen receptor-α-negative, androgen receptor (AR)-positive molecular apocrine subtype of breast cancer is driven by AR signaling. The MDA-MB-453 cell line is the prototypical model of this breast cancer subtype; its proliferation is stimulated by androgens such as 5α-dihydrotestosterone (DHT) but inhibited by the progestin medroxyprogesterone acetate (MPA) via AR-mediated mechanisms. We report here that the AR gene in MDA-MB-453 cells contains a G-T transversion in exon 7, resulting in a receptor variant with a glutamine to histidine substitution at amino acid 865 (Q865H) in the ligand binding domain. Compared with wild-type AR, the Q865H variant exhibited reduced sensitivity to DHT and MPA in transactivation assays in MDA-MB-453 and PC-3 cells but did not respond to non-androgenic ligands or receptor antagonists. Ligand binding, molecular modeling, mammalian two-hybrid and immunoblot assays revealed effects of the Q865H mutation on ligand dissociation, AR intramolecular interactions, and receptor stability. Microarray expression profiling demonstrated that DHT and MPA regulate distinct transcriptional programs in MDA-MB-453 cells. Gene Set Enrichment Analysis revealed that DHT- but not MPA-regulated genes were associated with estrogen-responsive transcriptomes from MCF-7 cells and the Wnt signaling pathway. These findings suggest that the divergent proliferative responses of MDA-MB-453 cells to DHT and MPA result from the different genetic programs elicited by these two ligands through the AR-Q865H variant. This work highlights the necessity to characterize additional models of molecular apocrine breast cancer to determine the precise role of AR signaling in this breast cancer subtype. Endocrine-Related Cancer (2012) 19, 599–613.
Abstract:
This paper deals with the problem of using data mining models in a real-world situation where the user cannot provide all the inputs with which the predictive model was built. A learning system framework, the Query Based Learning System (QBLS), is developed for improving the performance of predictive models in practice, where not all inputs are available for querying the system. An automatic feature selection algorithm called Query Based Feature Selection (QBFS) is developed to select features so as to balance a minimal subset of features against maximal classification accuracy. The performance of the QBLS system and the QBFS algorithm is successfully demonstrated with a real-world application.
Abstract:
This paper argues for a model of open systems evolution, based on evolutionary thermodynamics and complex systems science, as a design paradigm for sustainable architecture. The mechanism of open system evolution is specified through mathematical simulations and theoretical discourse. Based on this mechanism, the authors propose an intelligent building model of sustainable design built on a holistic information system linking the end-users, the building and nature. This information system is used to control the consumption of energy and material resources in the building system at the microscopic scale, and to adapt the environmental performance of the building system to the natural environment at the macroscopic scale, leading to an evolutionary emergence of sustainable performance in buildings.
Abstract:
In Australia, as far back as 1993, researchers such as Baladin and Chapmen reported that "18% of the total Australian population and 51% of the population over 60 years of age were identified as having a disability" (2001; p38.2). Statistics such as these are not by any means astonishing, even to members of the general public, and it is widely understood that these figures will only increase significantly in the near future. What is particularly surprising, however, in the face of such statistics, is the lack of new and creative responses to this demographic shift, particularly by the architecture and construction industries. The common response from a range of sectors seems to be the repetition of a series of models which offer limited, and often undesirable, housing options. It is against this backdrop, characterized by a lack of original options from mainstream practitioners and relevant government bodies, that the need has arisen to develop alternative models at the grass-roots level. This paper reports primarily on the work of one group comprising a not-for-profit organization, a pro-bono design practice group and a local university working together to design a more holistic, emotionally sustainable independent living model of housing for families where a member of the family has a disability. This approach recognizes the limitations of universal design in that it often does not " ... meet all the housing needs that arise for people with moderate to severe disabilities" (Scotts, Margie et al, 2007; p.17). It is hoped that by examining the work of such a collective, which is driven not by profit or policy but by the aim to address first and foremost individual and community need, better insight can be gained into the real requirements of individuals and families, as well as into new ways of fulfilling them.
Abstract:
Enterprise architecture (EA) management has become an intensively discussed approach to manage enterprise transformations. Despite the popularity and potential of EA, both researchers and practitioners lament a lack of knowledge about the realization of benefits from EA. To determine the benefits from EA, we explore the various dimensions of EA benefit realization and report on the development of a validated and robust measurement instrument. In this paper, we test the reliability and construct validity of the EA benefit realization model (EABRM), which we have designed based on the DeLone & McLean IS success model and findings from exploratory interviews. A confirmatory factor analysis confirms the existence of an impact of five distinct and individually important dimensions on the benefits derived from EA: EA artefact quality, EA infrastructure quality, EA service quality, EA culture, and EA use. The analysis presented in this paper shows that the EA benefit realization model is an instrument that demonstrates strong reliability and validity.
Abstract:
This paper proposes that critical realism can provide a useful theoretical foundation for studying enterprise architecture (EA) evolution. Specifically, it investigates the practically relevant and academically challenging question of how EAs integrate Service-oriented Architecture (SOA). Archer’s Morphogenetic theory is used as an analytical approach to distinguish the architectural conditions under which SOA is introduced, to study the relationships between these conditions and SOA introduction, and to reflect on the EA evolution (elaborations) that then takes place. The focus lies on the reasons why EA evolution takes place (or not) and on what architectural changes happen. This paper uses the findings of a literature review to build an a priori model, informed by Archer’s theory, to understand EA evolution in a field that often lacks a solid theoretical groundwork. The findings are threefold. First, EA can evolve on different levels (different integration outcomes). Second, the integration outcomes are classified into three levels: business architecture, information systems architecture and technology architecture. Third, the analytical separation using Archer’s theory is helpful in understanding how these different integration outcomes are generated.
Abstract:
Enterprise architectures are exposed to fast-emerging business and information technology capabilities. A prominent example is the paradigm of service-orientation, which introduces its own architectural requirements and impacts the design and ongoing evolution of enterprise architectures. This thesis develops the first theoretical model describing enterprise architecture evolution and outcomes in light of a changing IT landscape such as service-oriented architectures. The developed theoretical model explains enterprise architecture evolution, its main stages and related capabilities. This model can be used to derive theoretically sound guidelines for managing enterprise architectures in a changing environment.
Abstract:
Design Science is the process of solving ‘wicked problems’ through designing, developing, instantiating, and evaluating novel solutions (Hevner, March, Park and Ram, 2004). Wicked problems are described as agent finitude in combination with problem complexity and normative constraint (Farrell and Hooker, 2013). In Information Systems Design Science, determining that problems are ‘wicked’ differentiates Design Science research from Solutions Engineering (Winter, 2008) and is a necessary part of proving relevance to Information Systems Design Science research (Hevner, 2007; Iivari, 2007). Problem complexity is characterised as many problem components with nested, dependent and co-dependent relationships interacting through multiple feedback and feed-forward loops. Farrell and Hooker (2013) specifically state that for wicked problems “it will often be impossible to disentangle the consequences of specific actions from those of other co-occurring interactions”. This paper discusses the application of an Enterprise Information Architecture modelling technique to disentangle wicked problem complexity for one case. It proposes that such a modelling technique can be applied to other wicked problems and can lay the foundations for proving relevance to Design Science research, provide solution pathways for artefact development, and help substantiate the elements required to produce Design Theory.
Abstract:
We describe the development and parameterization of a grid-based model of African savanna vegetation processes. The model was developed with the objective of exploring elephant effects on the diversity of savanna species and structure, and in this formulation concentrates on the relative cover of grass and woody plants, the vertical structure of the woody plant community, and the distribution of these over space. Grid cells are linked by seed dispersal and fire, and environmental variability is included in the form of stochastic rainfall and fire events. The model was parameterized from an extensive review of the African savanna literature; where parameter values were available, they varied widely. The most plausible set of parameters produced long-term coexistence between woody plants and grass, with the tree-grass balance being more sensitive to changes in parameters influencing demographic processes and drought incidence and response, and less sensitive to the fire regime. There was considerable diversity in the woody structure of savanna systems within the range of uncertainty in tree growth rate parameters. Thus, given the paucity of height growth data for woody plant species in southern African savannas, managers of natural areas should be cognizant of different tree species' growth and damage response attributes when considering whether to act on perceived elephant threats to vegetation. © 2007 Springer Science+Business Media B.V.
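A toy sketch of a grid-based tree-grass model with stochastic rainfall, seed dispersal between neighbouring cells and stochastic fire, in the spirit of the model described above; all update rules and parameter values are illustrative assumptions, not the authors' parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy grid: each cell holds a fractional woody cover; the remainder is grass.
n, years = 50, 200
woody = rng.uniform(0.0, 0.3, size=(n, n))

for _ in range(years):
    rain = rng.gamma(shape=4.0, scale=150.0)      # stochastic annual rainfall (mm), illustrative
    growth = 0.05 * min(rain / 600.0, 1.0)        # woody growth rate scales with rainfall
    woody = np.clip(woody + growth * woody * (1.0 - woody), 0.0, 1.0)

    # Seed dispersal: each cell receives a small contribution from its four neighbours.
    neighbours = (np.roll(woody, 1, 0) + np.roll(woody, -1, 0)
                  + np.roll(woody, 1, 1) + np.roll(woody, -1, 1)) / 4.0
    woody = np.clip(woody + 0.01 * neighbours, 0.0, 1.0)

    # Stochastic fire: grassy (low woody cover) cells are more likely to burn; fire reduces woody cover.
    if rng.random() < 0.2:
        burning = rng.random((n, n)) < 0.5 * (1.0 - woody)
        woody[burning] *= 0.5

print(f"Mean woody cover after {years} years: {woody.mean():.2f}")
```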
Abstract:
We present a new algorithm to compute the voxel-wise genetic contribution to brain fiber microstructure using diffusion tensor imaging (DTI) in a dataset of 25 monozygotic (MZ) and 25 dizygotic (DZ) twin pairs (100 subjects total). First, the structural and DT scans were linearly co-registered. Structural MR scans were nonlinearly mapped via a 3D fluid transformation to a geometrically centered mean template, and the deformation fields were applied to the DTI volumes. After tensor re-orientation to realign them to the anatomy, we computed several scalar and multivariate DT-derived measures, including the geodesic anisotropy (GA), the tensor eigenvalues and the full diffusion tensors. A covariance-weighted distance was measured between twins in the Log-Euclidean framework [2], and used as input to a maximum-likelihood based algorithm to compute the contributions from genetics (A), common environmental factors (C) and unique environmental factors (E) to fiber architecture. Quantitative genetic studies can take advantage of the full information in the diffusion tensor by using covariance-weighted distances and statistics on the tensor manifold.
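A minimal sketch of the Log-Euclidean distance between two diffusion tensors, the building block underlying the covariance-weighted twin distances mentioned above; the tensors below are illustrative values, and the rest of the pipeline (covariance weighting, ACE maximum-likelihood fitting) is not reproduced:

```python
import numpy as np

def spd_log(m):
    """Matrix logarithm of a symmetric positive-definite tensor via eigendecomposition."""
    w, v = np.linalg.eigh(m)
    return (v * np.log(w)) @ v.T

def log_euclidean_distance(t1, t2):
    """Log-Euclidean distance between two SPD diffusion tensors."""
    return np.linalg.norm(spd_log(t1) - spd_log(t2), ord="fro")

# Illustrative 3x3 diffusion tensors (mm^2/s); not taken from the study's data.
d1 = np.diag([1.7e-3, 0.4e-3, 0.3e-3])   # strongly anisotropic (fiber-like)
d2 = np.diag([0.9e-3, 0.8e-3, 0.8e-3])   # nearly isotropic

print(f"Log-Euclidean distance: {log_euclidean_distance(d1, d2):.4f}")
```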