44 results for Model-driven development. Domain-specific languages. Case study
Abstract:
The potential and adaptive flexibility of Population Dynamics P systems (PDP) for studying population dynamics suggests that they may be suitable for modelling complex fluvial ecosystems, characterized by a composition of dynamic habitats with many variables that interact simultaneously. Using as a model a reservoir occupied by the zebra mussel Dreissena polymorpha, we designed a computational model based on P systems to study the population dynamics of larvae, in order to evaluate management actions to control or eradicate this invasive species. The population dynamics of this species were simulated under scenarios ranging from no change in water flow, through weekly variation with different flow rates, to the actual hydrodynamic situation of an intermediate flow rate. Our results show that PDP models can be very useful tools for modelling complex, partially desynchronized processes that run in parallel. This allows the study of complex hydroecological processes such as the one presented, where reproductive cycles, temperature and water dynamics are involved in the desynchronization of the population dynamics both within areas and among them. The results obtained may be useful in the management of other reservoirs with similar hydrodynamic situations in which the presence of this invasive species has been documented.
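As a rough illustration of the kind of scenario comparison described in this abstract, the sketch below steps a zebra-mussel larval population through a few weeks under a chosen flow scenario. It is not the authors' PDP model: the scenario names, rates, temperature thresholds and noise term are all hypothetical placeholders.

```python
# Minimal sketch (not the authors' PDP model): a discrete-time stochastic
# update of a larval population under different water-flow scenarios.
# All rates, thresholds and scenario names here are hypothetical.
import random

SCENARIOS = {"no_flow_change": 0.00, "intermediate_flow": 0.10, "weekly_high_flow": 0.25}

def step(larvae, temperature, flushing_rate):
    """One weekly update: reproduction depends on temperature, losses on flow."""
    reproduction = 0.30 if temperature > 15 else 0.05   # hypothetical thresholds
    survival = 1.0 - (0.20 + flushing_rate)             # natural mortality + flushing
    expected = larvae * (1.0 + reproduction) * max(survival, 0.0)
    return max(0, int(random.gauss(expected, expected * 0.05)))  # demographic noise

population = 10_000
for week, temp in enumerate([12, 14, 16, 18, 20, 19, 17, 15]):
    population = step(population, temp, SCENARIOS["intermediate_flow"])
    print(f"week {week}: {population} larvae")
```

Comparing the printed trajectories across the three scenario keys gives the flavour of the scenario analysis, without any of the membrane-computing machinery of the real model.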
Abstract:
This study examines the effects of a borderline-specific treatment, called general psychiatric management, on emotional change, outcome and therapeutic alliance of an outpatient presenting with borderline personality disorder. Based on the sequential model of emotional processing, emotional states were assessed in a 10-session setting. The case showed an increase in expressions of distress, no change in therapeutic alliance, and a tendency towards general deterioration. Results suggest that emotional processing may play a lesser role in early-phase treatment with general psychiatric management than previously hypothesized.
Abstract:
The Chakhama Valley, a remote area in Pakistan-administered Kashmir, was badly damaged by the 7.6-magnitude earthquake that struck India and Pakistan on 8 October 2005. More than 5% of the population lost their lives, and about 90% of the existing housing was irreparably damaged or completely destroyed. In early 2006, the Aga Khan Development Network (AKDN) initiated a multisector, community-driven reconstruction program in the Chakhama Valley on the premise that the scale of the disaster required a response that would address all aspects of people's lives. One important aspect covered the promotion of disaster risk management for sustainable recovery in a safe environment. Accordingly, prevailing hazards (rockfalls, landslides, and debris flow, in addition to earthquake hazards) and existing risks were thoroughly assessed, and the information was incorporated into the main planning processes. Hazard maps, detailed site investigations, and proposals for precautionary measures assisted engineers in supporting the reconstruction of private homes in safe locations to render investments disaster resilient. The information was also used for community-based land use decisions and disaster mitigation and preparedness. The work revealed three main problems: (1) thorough assessment of hazards and incorporation of this assessment into planning processes is time consuming and often little understood by the population directly affected, but it pays off in the long run; (2) relocating people out of dangerous places is a highly sensitive issue that requires the support of clear and forceful government policies; and (3) the involvement of local communities is essential for the success of mitigation and preparedness.
Abstract:
Understanding the run-time behavior of software systems can be a challenging activity. Debuggers are an essential category of tools used for this purpose, as they give developers direct access to the running systems. Nevertheless, traditional debuggers rely on generic mechanisms to introspect and interact with the running systems, while developers reason about and formulate domain-specific questions using concepts and abstractions from their application domains. This mismatch creates an abstraction gap between the debugging needs and the debugging support, leading to an inefficient and error-prone debugging effort, as developers need to recover concrete domain concepts using generic mechanisms. To reduce this gap and increase the efficiency of the debugging process, we propose a framework for developing domain-specific debuggers, called the Moldable Debugger, that enables debugging at the level of the application domain. The Moldable Debugger is adapted to a domain by creating and combining domain-specific debugging operations with domain-specific debugging views, and adapts itself to a domain by selecting, at run time, appropriate debugging operations and views. To ensure the proposed model has practical applicability (i.e., can be used in practice to build real debuggers), we discuss three implementation strategies from both a performance and a usability point of view. We further motivate the need for domain-specific debugging, identify a set of key requirements, and show how our approach improves debugging by adapting the debugger to several domains.
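To make the architecture concrete, here is a loose sketch of the moldable-debugger idea: domain-specific operations and views are bundled into extensions, and an extension is chosen at run time by a predicate over the debugging session. The real Moldable Debugger is implemented in Pharo Smalltalk; the Python classes, names and predicate protocol below are illustrative assumptions, not the actual API.

```python
# Illustrative sketch only: bundling domain-specific operations and views,
# activated at run time by a predicate over the debugging session.
class DebuggerExtension:
    def __init__(self, name, activates_on, operations, views):
        self.name = name
        self.activates_on = activates_on      # predicate over the debugging session
        self.operations = operations          # domain-specific stepping actions
        self.views = views                    # domain-specific inspection views

class MoldableDebugger:
    def __init__(self, extensions):
        self.extensions = extensions

    def open_on(self, session):
        # Select, at run time, the extensions whose predicate matches the session.
        active = [e for e in self.extensions if e.activates_on(session)]
        chosen = active[0] if active else DebuggerExtension(
            "generic", lambda s: True, ["step into", "step over"], ["stack", "variables"])
        print(f"Opening '{chosen.name}' debugger with views {chosen.views}")

parser_ext = DebuggerExtension(
    "parser", lambda s: s.get("domain") == "parsing",
    operations=["step to next production"], views=["input stream", "parse tree"])

MoldableDebugger([parser_ext]).open_on({"domain": "parsing"})
```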
Abstract:
When a project is realized in a globalized environment, multiple stakeholders from different organizations work on the same system. Depending on the stakeholders and their organizations, various (possibly overlapping) concerns are raised in the development of the system. In this context, a Domain-Specific Language (DSL) supports the work of a group of stakeholders who are responsible for addressing a specific set of concerns. This chapter identifies the open challenges arising from the coordination of globalized domain-specific languages. We identify two types of coordination: technical coordination and social coordination. After presenting an overview of the current state of the art, we first discuss the open challenges arising from the composition of multiple DSLs, and then the open challenges associated with collaboration in a globalized environment.
Abstract:
Object-oriented meta-languages such as MOF or EMOF are often used to specify domain-specific languages. However, these meta-languages lack the ability to describe behavior or operational semantics. Several approaches have used a subset of Java mixed with OCL as executable meta-languages. In this paper, we report our experience of using Smalltalk as an executable and integrated meta-language. We validated this approach by incrementally building, over the last decade, Moose, a meta-described reengineering environment. The reflective capabilities of Smalltalk support a uniform way of letting the base developer focus on his tasks while at the same time allowing him to meta-describe his domain model. The advantage of this approach is that the developer uses the same tools and environment.
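As a loose analogy (not the Moose implementation, which relies on Smalltalk's reflective facilities), the sketch below shows a domain class that carries its own meta-descriptions, discoverable by tooling through reflection. The `meta_property` decorator and the `Method` class are hypothetical illustrations of the idea of meta-describing a domain model alongside ordinary code.

```python
# Hedged analogy of reflective meta-description: an ordinary domain class
# annotated with meta-descriptions that tools can discover at run time.
def meta_property(description):
    """Decorator recording a meta-description on an ordinary accessor."""
    def wrap(method):
        method.meta_description = description
        return method
    return wrap

class Method:                                   # plain domain class of a reengineering model
    def __init__(self, name, lines_of_code):
        self._name, self._loc = name, lines_of_code

    @meta_property("the method's name (string)")
    def name(self):
        return self._name

    @meta_property("lines of code (integer metric)")
    def lines_of_code(self):
        return self._loc

# Tooling discovers the meta-descriptions reflectively, without a separate MOF model.
for attr in vars(Method).values():
    if callable(attr) and hasattr(attr, "meta_description"):
        print(attr.__name__, "->", attr.meta_description)
```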
Abstract:
The aim of the present single case study was to investigate oculomotor recovery in a patient with simultanagnosia due to biparietal hypoxic lesions. Applying visual exploration tasks as well as basic oculomotor tasks in three consecutive test sessions (8 weeks, 14 weeks, and 37 weeks after brain damage had occurred), differential recovery was observed. While visual exploration improved remarkably, an impaired disengagement of attention persisted. The improvement in exploration behaviour is interpreted within an oculomotor network theory, and implications for a deficit-specific recovery from simultanagnosia are discussed.
Abstract:
Despite widespread use of species-area relationships (SARs), dispute remains over the most representative SAR model. Using data from small-scale SARs of Estonian dry grassland communities, we address three questions: (1) Which model describes these SARs best when known artifacts are excluded? (2) How do deviating sampling procedures (marginal instead of central position of the smaller plots in relation to the largest plot; single values instead of average values; randomly located subplots instead of nested subplots) influence the properties of the SARs? (3) Are those effects likely to bias the selection of the best model? Our general dataset consisted of 16 nested-plot series (1 cm² to 100 m², any-part system), each of which comprised five series of subplots located in the four corners and the centre of the 100-m² plot. Data for the three pairs of compared sampling designs were generated from this dataset by subsampling. Five function types (power, quadratic power, logarithmic, Michaelis-Menten, Lomolino) were fitted with non-linear regression. In some of the communities, we found extremely high species densities (including bryophytes and lichens), namely up to eight species in 1 cm² and up to 140 species in 100 m², which appear to be the highest documented values on these scales. For SARs constructed from nested-plot average-value data, the regular power function generally was the best model, closely followed by the quadratic power function, while the logarithmic and Michaelis-Menten functions performed poorly throughout. However, the relative fit of the latter two models increased significantly relative to the respective best model when the single-value or random-sampling method was applied, although the power function normally remained far superior. These results confirm the hypothesis that both single-value and random-sampling approaches cause artifacts by increasing stochasticity in the data, which can lead to the selection of inappropriate models.
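The model comparison described here can be sketched as follows: fit candidate SAR functions by non-linear least squares and rank them with an information criterion. The data points, starting values and the use of AIC below are illustrative assumptions; only the functional forms (power and logarithmic) follow the abstract.

```python
# Sketch of the kind of SAR model comparison described above, on made-up data:
# fit the power SAR S = c * A**z and the logarithmic SAR S = c + z*log(A)
# by non-linear least squares and compare them with AIC.
import numpy as np
from scipy.optimize import curve_fit

area = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1, 10, 100])   # plot area in m^2 (nested series)
species = np.array([3, 8, 15, 30, 55, 90, 140])          # hypothetical richness values

def power(a, c, z):        return c * a**z
def logarithmic(a, c, z):  return c + z * np.log(a)

def aic(model, xdata, ydata, p0):
    params, _ = curve_fit(model, xdata, ydata, p0=p0, maxfev=10000)
    resid = ydata - model(xdata, *params)
    n, k = len(ydata), len(params)
    return n * np.log(np.sum(resid**2) / n) + 2 * k      # lower AIC = better fit

print("power AIC:      ", aic(power, area, species, p0=[50, 0.2]))
print("logarithmic AIC:", aic(logarithmic, area, species, p0=[50, 10]))
```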
Abstract:
In this paper we compare the performance of two image classification paradigms (object-based and pixel-based) for creating a land cover map of Asmara, the capital of Eritrea, and its surrounding areas, using Landsat ETM+ imagery acquired in January 2000. The classification methods used were maximum likelihood for the pixel-based approach and Bhattacharyya distance for the object-oriented approach, available in the ArcGIS and SPRING software packages, respectively. Advantages and limitations of both approaches are presented and discussed. Classification outputs were assessed using overall accuracy and Kappa indices. The pixel- and object-based classification methods resulted in overall accuracies of 78% and 85%, respectively, and Kappa coefficients of 0.74 and 0.82, respectively. Although the pixel-based approach is the most commonly used method, assessment and visual interpretation of the results clearly reveal that the object-oriented approach has advantages for this specific case study.
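The two accuracy measures quoted above can be reproduced from a classification confusion matrix as sketched below; the class counts are made up, and only the standard formulas for overall accuracy and Cohen's Kappa follow the abstract.

```python
# Sketch of the accuracy assessment: overall accuracy and Cohen's Kappa
# computed from a (hypothetical) land-cover confusion matrix.
import numpy as np

# rows = reference classes, columns = classified classes (made-up counts)
confusion = np.array([[50,  5,  2],
                      [ 4, 60,  6],
                      [ 1,  7, 65]])

n = confusion.sum()
overall_accuracy = np.trace(confusion) / n

# Kappa corrects overall accuracy for the agreement expected by chance.
expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
kappa = (overall_accuracy - expected) / (1 - expected)

print(f"overall accuracy = {overall_accuracy:.2f}, kappa = {kappa:.2f}")
```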
Abstract:
The mid-Holocene (6 kyr BP; thousand years before present) is a key period for studying the consistency between model results and proxy-based reconstruction data, as it corresponds to a standard test for models and a reasonable number of proxy-based records is available. Taking advantage of this relatively large amount of information, we have compared a compilation of 50 air and sea surface temperature reconstructions with the results of three simulations performed with general circulation models and one carried out with LOVECLIM, a model of intermediate complexity. The conclusions derived from this analysis confirm that models and data agree on the large-scale spatial pattern, but the models underestimate the magnitude of some observed changes, and large discrepancies are observed at the local scale. To further investigate the origin of those inconsistencies, we have constrained LOVECLIM to follow the signal recorded by the proxies selected in the compilation, using a data-assimilation method based on a particle filter. In one simulation, all 50 proxy-based records are used, while in the other two only the continental or only the oceanic proxy-based records constrain the model results. As expected, data assimilation improves the consistency between model results and the reconstructions. In particular, this is achieved in a robust way in all the experiments through a strengthening of the mid-latitude westerlies that warms northern Europe. Furthermore, the comparison of the LOVECLIM simulations with and without data assimilation has also objectively identified 16 proxy-based paleoclimate records whose reconstructed signal is incompatible either with the signal recorded by some other proxy-based records or with the model physics.
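A minimal sketch of the particle-filter step behind this data assimilation is given below. In the actual experiments each particle is a full LOVECLIM simulation constrained by many proxy records; here a particle is just a scalar temperature anomaly, and the proxy value, its error and the ensemble size are hypothetical.

```python
# Minimal particle-filter sketch: weight an ensemble against one proxy record,
# then resample so that members consistent with the proxy are retained.
import numpy as np

rng = np.random.default_rng(0)
n_particles = 100
particles = rng.normal(0.0, 1.0, n_particles)    # prior ensemble of anomalies (degC)

proxy_value, proxy_error = 0.8, 0.3              # one proxy-based reconstruction (assumed)

# Weight each particle by how well it matches the proxy (Gaussian likelihood) ...
weights = np.exp(-0.5 * ((particles - proxy_value) / proxy_error) ** 2)
weights /= weights.sum()

# ... then resample: particles consistent with the proxy are duplicated,
# inconsistent ones are dropped.
particles = rng.choice(particles, size=n_particles, replace=True, p=weights)

print("posterior ensemble mean:", particles.mean())
```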
Abstract:
Alpine grasslands are an important source of fodder for the cattle of Alpine farmers. These pastures can be used for grazing only during the short summer season. With the anticipated climate change, it is likely that plant production, and thus the fodder basis for the cattle, will be influenced. Investigating the dependence of biomass production on topoclimatic factors will allow us to better understand how anticipated climate change may influence this traditional Alpine farming system. Because small-scale topoclimatological variations of the main meteorological variables (temperature, humidity, precipitation, shortwave incoming radiation and wind speed) are not easily derived from available long-term climate stations in mountainous terrain, our goal was to investigate the topoclimatic variations over the pastures belonging to the Alp Weissenstein research station north of the Albula Pass in the eastern Swiss Alps. We present a basic assessment of current topoclimatic conditions as a site characterization for ongoing ecological climate change studies. To link short-term studies with long-term climate records, we related agrometeorological measurements to those of surrounding long-term sites run by MeteoSwiss, both on valley bottoms (Davos, Samedan) and on mountain tops (Weissfluhjoch, Piz Corvatsch). We found that the Davos climate station north of the study area correlates most closely with the local climate of Alp Weissenstein, although a much closer site (Samedan) exists on the other side of the Albula Pass. Mountain-top stations, however, did not provide a convincing approximation of the climate at Alp Weissenstein. Direct comparisons of near-surface measurements from a set of 11 small weather stations distributed over the domain where cattle and sheep are grazed indicate that nocturnal minimum air temperature and minimum vapor pressure deficit are mostly governed by the altitudinal gradient, whereas daily maxima (including wind speed) depend more strongly on vegetation cover and less on altitude.
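The station-matching step described above (relating the short-term local record to long-term reference stations) amounts to a simple correlation screening, sketched below; the station names follow the abstract, but the daily values and the choice of Pearson correlation are assumptions.

```python
# Sketch of correlating a short-term local record with candidate long-term
# reference records and keeping the best match (values are made up).
import numpy as np

local = np.array([2.1, 3.4, 5.0, 4.2, 6.1, 7.3, 6.8])          # Alp Weissenstein (hypothetical)
references = {
    "Davos":   np.array([1.8, 3.0, 4.7, 4.0, 5.9, 7.0, 6.5]),
    "Samedan": np.array([0.5, 2.2, 3.1, 4.8, 4.0, 6.9, 5.1]),
}

for name, series in references.items():
    r = np.corrcoef(local, series)[0, 1]                        # Pearson correlation
    print(f"{name}: r = {r:.2f}")
```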
Abstract:
The Penninic nappes in the Swiss Alps formed during continental collision between the Adriatic and European plates in Cenozoic times. Although intensely studied, the finite geometry of the basement-bearing Penninic nappes in western Switzerland has remained a matter of debate for decades (e.g., the "Siviez-Mischabel dilemma"), and the paleogeographic origin of various nappes has been disputed. Here, we present new structural data for the central part of the Penninic Bernard nappe complex, which contains pre-Permian basement and Permo-Mesozoic metasedimentary units. Our lithological and structural observations indicate that the discrepancy between the different structural models proposed for the Bernard nappe complex can be explained by a lateral discontinuity. In the west, the presence of a Permian graben caused complex isoclinal folding, whereas in the east, the absence of such a graben resulted mainly in imbricate thrusting. The overall geometry of the Bernard nappe complex is the result of three main deformation phases: (1) detachment of Mesozoic cover sediments along Triassic evaporites (Evolène phase) during the early stages of collision, (2) Eocene top-to-the-N(NW) nappe stacking (Anniviers phase), and (3) subsequent backfolding and backshearing (Mischabel phase). The southward localized backshearing is key to understanding the structural position and paleogeographic origin of units such as the Frilihorn and Cimes Blanches "nappes" and the Antrona ophiolites. Based on these observations, we present a new tectonic model for the entire Penninic region of western Switzerland and discuss this model in terms of continental collision zone processes.