918 results for Generation of test processes
Abstract:
Background: Plant-soil interaction is central to human food production and ecosystem function. It is therefore essential not only to understand these interactions, but also to develop predictive mathematical models that can be used to assess how climate and soil management practices will affect them. Scope: In this paper we review current developments in the structural and chemical imaging of rhizosphere processes within the context of multiscale, mathematical, image-based modeling. We outline areas that need more research and areas that would benefit from more detailed understanding. Conclusions: We conclude that the combination of structural and chemical imaging with modeling is an incredibly powerful tool that is fundamental for understanding how plant roots interact with soil, and we emphasize the need to attract more researchers to an area that is so fertile for future discoveries. Finally, model building must go hand in hand with experiments. In particular, there is a real need to integrate rhizosphere structural and chemical imaging with modeling to better understand rhizosphere processes, leading to models that explicitly account for pore-scale processes.
Abstract:
This paper presents an application that composes formal poetry in Spanish in a semiautomatic, interactive fashion. JASPER is a forward-reasoning, rule-based system that obtains from the user an intended message, the desired metric, a choice of vocabulary, and a corpus of verses; by intelligently adapting selected examples from this corpus using the given words, it carries out a prose-to-poetry translation of the given message. In the composition process, JASPER combines natural language generation with a set of construction heuristics obtained from the formal literature on Spanish poetry.
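A minimal sketch (hypothetical code, not JASPER's actual implementation) of the kind of metric-preserving adaptation the abstract describes: words from the user's vocabulary replace words of an example verse only if they keep the syllable count.

```python
# Illustrative sketch, not JASPER's code: adapt an example verse by
# substituting user vocabulary while preserving each word's syllable count.

def syllables(word: str) -> int:
    """Crude Spanish syllable count: one syllable per vowel group."""
    vowels = "aeiouáéíóúü"
    count, prev = 0, False
    for ch in word.lower():
        is_vowel = ch in vowels
        if is_vowel and not prev:
            count += 1
        prev = is_vowel
    return count

def adapt_verse(example: list[str], vocabulary: list[str]) -> list[str]:
    """Forward chaining: fire the first substitution that preserves the metre."""
    pool = list(vocabulary)
    verse = []
    for word in example:
        target = syllables(word)
        match = next((w for w in pool if syllables(w) == target), word)
        if match in pool:
            pool.remove(match)
        verse.append(match)
    return verse

print(adapt_verse(["luna", "clara", "brilla"], ["noche", "verde", "canta"]))
```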
Abstract:
NO FUNDING
Abstract:
In this work, fabrication processes for daylight guiding systems based on micromirror arrays are developed, evaluated and optimized. Two different approaches are used: first, nanoimprint lithography is used to fabricate large-area micromirrors by means of Substrate Conformal Imprint Lithography (SCIL); secondly, a new lithography technique is developed using a novel bi-layered photomask to fabricate large-area micromirror arrays. The experimental results show a reproducible, stable process with high yield that consumes less material, time, cost and effort.
Abstract:
The simulation of ultrafast photoinduced processes is a fundamental step towards understanding the underlying molecular mechanism and interpreting/predicting experimental data. Performing a computer simulation of a complex photoinduced process is only possible by introducing some approximations but, in order to obtain reliable results, the need to reduce complexity must be balanced with the accuracy of the model, which should include all the relevant degrees of freedom and a quantitatively correct description of the electronic states involved in the process. This work presents new computational protocols and strategies for the parameterisation of accurate models of photochemical/photophysical processes based on state-of-the-art multiconfigurational wavefunction-based methods. The required ingredients for a dynamics simulation include potential energy surfaces (PESs) as well as electronic state couplings, which must be mapped across the wide range of geometries visited during the wavepacket/trajectory propagation. The developed procedures make it possible to obtain solid and extended databases while reducing the computational cost as much as possible, thanks to, e.g., specific tuning of the level of theory for different PES regions and/or direct calculation of only the needed components of vectorial quantities (such as gradients or nonadiabatic couplings). The approaches were applied to three case studies (azobenzene, pyrene, visual rhodopsin), all requiring an accurate parameterisation but for different reasons. The resulting models and simulations elucidated the mechanism and time scale of the internal conversion, reproducing or even predicting new transient experiments. The general applicability of the developed protocols to systems with different peculiarities, and the possibility of parameterising different types of dynamics on an equal footing (classical vs purely quantum), prove that the procedures are flexible enough to be tailored to each specific system, and pave the way for exact quantum dynamics with multiple degrees of freedom.
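In standard notation (a generic recap, not the thesis's specific formulation), the quantities to be mapped over the nuclear coordinates $\mathbf{q}$ for electronic states $i, j$ are

$$E_i(\mathbf{q}), \qquad \mathbf{g}_i(\mathbf{q}) = \nabla_{\mathbf{q}} E_i(\mathbf{q}), \qquad \mathbf{d}_{ij}(\mathbf{q}) = \langle \psi_i(\mathbf{q}) \,|\, \nabla_{\mathbf{q}}\, \psi_j(\mathbf{q}) \rangle,$$

i.e. the PESs, their gradients, and the nonadiabatic couplings; computing only the components of $\mathbf{g}_i$ and $\mathbf{d}_{ij}$ that the propagation actually needs is what keeps the database affordable.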
Abstract:
With an increasing demand for rural resources and land, new challenges are emerging that affect and restructure the European countryside. While this creates opportunities for rural living, it has also opened a discussion on the risks of rural gentrification. The concept of rural gentrification describes the influx of new residents leading to an economic upgrade of an area, making it unaffordable for local inhabitants to stay. Rural gentrification occurs in areas perceived as attractive. Paradoxically, in-migrants re-shape their surrounding landscape. Rural gentrification may displace not only people but also landscape values. Thus, this research aims to understand the twofold role of landscape in rural gentrification theory: as a possible driver attracting residents and as a product shaped by its residents. To understand potential gentrifiers' decision processes, this research compiled a collection of drivers behind in-migration. Moreover, essential indicators of rural gentrification were collected from previous studies. Yet the available indicators do not contain measures for understanding the related landscape changes. To fill this gap, after analysing established landscape assessment methodologies and evaluating their relevance for assessing gentrification, a new Landscape Assessment approach is proposed. This method introduces a novel way to capture landscape change caused by gentrification through historical depth. The measures for studying gentrification were applied on Gotland, Sweden. The study showed a stagnating population while the number of properties increased and housing prices rose. These factors indicate not positive growth but a risk of gentrification. The research then applied the proposed Landscape Assessment method to areas exposed to gentrification. The results suggest that landscape change takes place on a local scale and could, over time, endanger key characteristics. The methodology contributes to a discussion on grasping nuances within the rural context. It has also proven useful for indicating cumulative changes, which is necessary for managing landscape values.
Abstract:
Nowadays, the chemical industry has reached significant goals in producing essential components for human beings. The growing competitiveness of the market has caused an important acceleration in R&D activities, introducing new opportunities and procedures for process improvement and optimization. In this dynamic context, sustainability is becoming one of the key aspects of technological progress, encompassing economic, environmental protection and safety aspects. With respect to the conceptual definition of sustainability, the literature reports an extensive discussion of strategies, as well as sets of specific principles and guidelines. However, literature procedures are not completely suitable and applicable to process design activities. Therefore, the development and introduction of sustainability-oriented methodologies is a necessary step to enhance process and plant design. The definition of key drivers as a support system is a focal point for early process design decisions or for the implementation of process modifications. In this context, three different methodologies are developed to support design activities, providing criteria and guidelines from a sustainable perspective. In this framework, a set of Key Performance Indicators is selected and adopted to characterize the environmental, safety, economic and energetic aspects of a reference process. The methodologies are based on heat and material balances, and the level of detail required for input data is compatible with the information available for the specific application. Multiple case studies are defined to prove the effectiveness of the methodologies. The principal application is the polyolefin production lifecycle chain, with particular focus on polymerization technologies. In this context, different design phases are investigated, spanning from early process feasibility studies to the assessment of operation and improvements. This flexibility allows the methodologies to be applied at any level of design, providing supporting guidelines for design activities, comparing alternative solutions, monitoring the operating process and identifying potential improvements.
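A minimal sketch (hypothetical indicator names and bounds, not the thesis's actual indicator set) of how normalized Key Performance Indicators computed from balance data can be aggregated into a single score for comparing design alternatives:

```python
# Illustrative sketch: aggregate normalized KPIs (lower raw value = better)
# into a weighted sustainability score for ranking design alternatives.

def normalize(value: float, best: float, worst: float) -> float:
    """Map a raw indicator onto [0, 1], where 1 is best."""
    return max(0.0, min(1.0, (worst - value) / (worst - best)))

def sustainability_score(kpis: dict[str, float],
                         bounds: dict[str, tuple[float, float]],
                         weights: dict[str, float]) -> float:
    """Weighted average of the normalized indicators."""
    total = sum(weights.values())
    return sum(weights[k] * normalize(v, *bounds[k]) for k, v in kpis.items()) / total

# hypothetical example: CO2 intensity (t/t product), energy use (GJ/t), safety index
kpis = {"co2": 1.8, "energy": 12.0, "safety": 0.3}
bounds = {"co2": (0.5, 3.0), "energy": (5.0, 20.0), "safety": (0.0, 1.0)}
weights = {"co2": 0.4, "energy": 0.4, "safety": 0.2}
print(f"score = {sustainability_score(kpis, bounds, weights):.2f}")
```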
Abstract:
This thesis applies queer theories to the examination of experiences that go beyond queerness. Queer, decolonial, antiracist and feminist new materialist concepts are applied to the analysis of four case studies dealing with power and art in public spaces. By using concepts as methodologies, and autoethnographic reflections and f(r)ictions as research alternatives, the thesis brings forward new diffractive readings from which to perform those scenarios differently. In doing so, the thesis disentangles the historical, material, philosophical, political and disruptive meanings that haunt the four case studies and triggers the artivist potential of their counter-hegemonic narratives.
Abstract:
The pursuit of decarbonization and increased efficiency in internal combustion engines (ICE) is crucial for reducing pollution in the mobility sector. While electrification is a long-term goal, ICE still has a role to play if coupled with innovative technologies. This research project explores various solutions to enhance ICE efficiency and reduce emissions, including Low Temperature Combustion (LTC), Dual fuel combustion with diesel and natural gas, and hydrogen integration. LTC methods like Dual fuel and Reactivity Controlled Compression Ignition (RCCI) show promise in lowering emissions such as NOx, soot, and CO2. Dual fuel Diesel-Natural Gas with hydrogen addition demonstrates improved efficiency, especially at low loads. RCCI Diesel-Gasoline engines offer increased Brake Thermal Efficiency (BTE) compared to standard diesel engines while reducing specific NOx emissions. The study compares 2-Stroke and 4-Stroke engine layouts, optimizing scavenging systems for both aircraft and vehicle applications. CFD analysis enhances specific power output while addressing injection challenges to prevent exhaust short circuits. Additionally, piston bowl shape optimization in Diesel engines running on Dual fuel (Diesel-Biogas) aims to reduce NOx emissions and enhance thermal efficiency. Unconventional 2-Stroke architectures, such as reverse loop scavenged with valves for high-performance cars, opposed piston engines for electricity generation, and small loop scavenged engines for scooters, are also explored. These innovations, alongside ultra-lean hydrogen combustion, offer diverse pathways toward achieving climate neutrality in the transport sector.
Abstract:
In recent years, autonomous aerial vehicles have gained large popularity in a variety of automation applications. To accomplish various and challenging tasks, the capability of generating trajectories has assumed a key role. As higher performance is sought, traditional flatness-based trajectory generation schemes show their limitations, since these approaches neglect the highly nonlinear dynamics of the quadrotor. Strategies based on optimal control principles therefore turn out to be beneficial: in the trajectory generation process they allow the control unit to best exploit the actual dynamics, and they enable the drone to perform quite aggressive maneuvers. This dissertation is concerned with the development of an optimal control technique to generate trajectories for autonomous drones. The algorithm adopted to this end is a second-order iterative method working directly in continuous time which, under proper initialization, guarantees quadratic convergence to a locally optimal trajectory. At each iteration a quadratic approximation of the cost functional is minimized and a descent direction is obtained as a linear-affine control law, after solving a differential Riccati equation. The algorithm has been implemented and its effectiveness has been tested on the vectored-thrust dynamical model of a quadrotor in a realistic simulation setup.
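In standard continuous-time LQ notation (a schematic recap, not the dissertation's exact formulation), each iteration linearizes the dynamics around the current trajectory, with $A(t)$, $B(t)$ the linearization and $Q$, $R$, $Q_T$ the weights of the quadratic cost model; the descent direction then comes from a backward differential Riccati equation and an affine control law:

$$-\dot{P}(t) = A(t)^{\top} P(t) + P(t) A(t) - P(t) B(t) R^{-1} B(t)^{\top} P(t) + Q, \qquad P(T) = Q_T,$$

$$\delta u(t) = -R^{-1} B(t)^{\top} \big( P(t)\, \delta x(t) + r(t) \big),$$

where $r(t)$ collects the affine terms accumulated in the backward sweep.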
Abstract:
Stellar occultations are the most accurate Earth-based astronomy technique for obtaining the lateral position of celestial bodies; in the case of natural satellites, their accuracy also depends on the central body that the satellite orbits. The main goal of this thesis is to analyze if and how very long baseline interferometry (VLBI) measurements of a body like Jupiter can be used in support of stellar occultations of its natural satellites by reducing the planetary uncertainty at the time of the occultation. In particular, we analyzed the stellar occultations of Callisto (15.01.2024) and Io (02.04.2021). The stellar occultation of Callisto was predicted and simulated using the stellar occultation reduction analysis (SORA) toolkit, while the stellar occultation of Io had already been studied by Morgado et al. We then simulated the VLBI data of Jupiter according to the current JUNO trajectories. The simulated observations were used as input to an estimation, on which we performed a covariance analysis of the estimated parameters to retrieve the formal errors (1σ uncertainties) at each epoch of the propagation. The results show that adding VLBI slightly improves the uncertainty of Callisto even when the knowledge of Jupiter's state is poor, while for Io the VLBI data are especially crucial in the scenario of an a priori uncertainty in Jupiter's state of about 10 km: there, the estimated initial state of Io improves by about 70 m, 230 m and 900 m in the radial, along-track and cross-track directions respectively. Moreover, we also obtained the propagated errors of the two moons in right ascension and declination, both of which show uncertainties at the mas level at the occultation time. Finally, we simulated Io and Europa together and observed that at the time of the stellar occultation of Europa the along-track component of Io is constrained, confirming the coupling between the two inner moons.
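A minimal sketch (illustrative toy example, not the thesis's actual estimation pipeline) of the covariance-analysis step: an estimated covariance is mapped to later epochs with the state transition matrix, and the formal 1-sigma errors are the square roots of its diagonal.

```python
# Illustrative sketch: propagate a covariance matrix with state transition
# matrices and read off the formal 1-sigma uncertainties at each epoch.
import numpy as np

def formal_errors(P0: np.ndarray, stms: list[np.ndarray]) -> list[np.ndarray]:
    """For each state transition matrix Phi(t_k, t_0), return sqrt(diag(P_k))."""
    errors = []
    for phi in stms:
        P_k = phi @ P0 @ phi.T          # linear covariance mapping
        errors.append(np.sqrt(np.diag(P_k)))
    return errors

# toy example: 2-state system (position, velocity), two epochs
P0 = np.diag([100.0, 4.0])              # initial covariance (km^2, (km/s)^2)
stms = [np.eye(2), np.array([[1.0, 60.0], [0.0, 1.0]])]
for k, sig in enumerate(formal_errors(P0, stms)):
    print(f"epoch {k}: 1-sigma = {sig}")
```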
Abstract:
Privacy issues and data scarcity in the PET field call for efficient methods to expand datasets via synthetic generation of new data that cannot be traced back to real patients and that are also realistic. In this thesis, machine learning techniques were applied to 1001 amyloid-beta PET images that had undergone a diagnosis of Alzheimer's disease: the evaluations were 540 positive, 457 negative and 4 unknown. The Isomap algorithm was used as a manifold learning method to reduce the dimensions of the PET dataset; a numerical scale-free interpolation method was applied to invert the dimensionality reduction map. The interpolant was tested on the PET images via LOOCV, where the removed images were compared with the reconstructed ones using the mean SSIM index (MSSIM = 0.76 ± 0.06). The effectiveness of this measure is questioned, since it indicated slightly higher performance for a comparison method using PCA (MSSIM = 0.79 ± 0.06), which gave clearly poorer-quality reconstructed images than those recovered by the numerical inverse mapping. Ten synthetic PET images were generated and, after being mixed with ten originals, were sent to a team of clinicians for a visual assessment of their realism; no significant agreement was found either between the clinicians and the true image labels or among the clinicians, meaning that original and synthetic images were indistinguishable. The future perspective of this thesis is to advance the amyloid-beta PET research field by increasing the available data, overcoming the constraints of data acquisition and privacy issues. Potential improvements can be achieved by refining the manifold learning and inverse mapping stages of the PET image analysis, by exploring different combinations of algorithm parameters and by applying other non-linear dimensionality reduction algorithms. A final prospect of this work is the search for new methods to assess image reconstruction quality.
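A minimal sketch (illustrative, with made-up array shapes) of the pipeline's building blocks: embedding images with Isomap and scoring a reconstruction against its original with SSIM, as in the LOOCV described above. The inverse mapping itself is the thesis's custom scale-free interpolation and is only stubbed here with a nearest-neighbour lookup.

```python
# Illustrative sketch: Isomap embedding + SSIM scoring of a reconstruction.
import numpy as np
from sklearn.manifold import Isomap
from skimage.metrics import structural_similarity as ssim

rng = np.random.default_rng(0)
images = rng.random((100, 32, 32))            # stand-in for the PET volumes
X = images.reshape(len(images), -1)           # flatten to (n_samples, n_features)

embedding = Isomap(n_neighbors=10, n_components=5).fit_transform(X)

def reconstruct(y: np.ndarray) -> np.ndarray:
    """Stub for the inverse map: nearest neighbour in embedding space."""
    idx = np.argmin(np.linalg.norm(embedding - y, axis=1))
    return images[idx]

recon = reconstruct(embedding[0])
print("SSIM:", ssim(images[0], recon, data_range=1.0))
```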
Abstract:
The rise of component-based software development has created an urgent need for effective application program interface (API) documentation. Experience has shown that it is hard to create precise and readable documentation. Prose documentation can provide a good overview but lacks precision. Formal methods offer precision, but the resulting documentation is expensive to develop; worse, few developers have the skill or inclination to read formal documentation. We present a pragmatic solution to the problem of API documentation: we augment the prose documentation with executable test cases, including expected outputs, and use the prose plus the test cases as the documentation. With appropriate tool support, the test cases are easy to develop and read. Such test cases constitute a completely formal, albeit partial, specification of input/output behavior. Equally important, consistency between code and documentation is demonstrated by running the test cases. This approach provides an attractive bridge between formal and informal documentation. We also present a tool that supports compact and readable test cases and generates test drivers and documentation, and we illustrate the approach with detailed case studies.
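A minimal sketch (hypothetical API, using Python's doctest rather than the paper's own tool) of the idea: the executable examples with expected outputs double as documentation and as a partial input/output specification, and running them demonstrates code/documentation consistency.

```python
# Illustrative sketch: executable test cases embedded in the documentation.
import doctest
import re

def slugify(title: str) -> str:
    """Convert a title into a URL slug.

    >>> slugify("Hello, World!")
    'hello-world'
    >>> slugify("  spaces   everywhere ")
    'spaces-everywhere'
    """
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

if __name__ == "__main__":
    doctest.testmod(verbose=True)   # running the docs checks code/doc consistency
```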
Abstract:
Academic dishonesty during examinations raises important issues for the integrity of assessments. As ICT becomes increasingly present in test administration, it is important, with this mode of data collection, to ensure a level of security equal to or even higher than that achieved when the traditional paper-and-pencil mode of data collection is used. There is considerable research on the use of ICT in assessment, but little of it addresses security measures when ICT is used. For this thesis, thirteen Quebec organizations were interviewed: six that used ICT in test administration, five that used paper-and-pencil administration but wished to use ICT, and two that used paper-and-pencil administration and did not wish to use ICT. The organizations are educational institutions (primary, secondary, college, university), private companies, government or municipal agencies, and professional orders. Semi-structured interviews and a qualitative analysis based on the presence or absence of different characteristics made it possible to document the security measures related to data collection for assessment using ICT. These measures were compared with those used in paper-and-pencil data collection for assessment, to see how they vary when ICT is used. The results reveal that using ICT in test administration makes test preparation more complex and adds steps needed to ensure an adequate level of security. However, it also enables new functions regarding question types, multimedia integration, adaptive questions and random test generation, which counter certain forms of academic dishonesty that were already present with paper-and-pencil administration and were previously difficult to act on. Nevertheless, using ICT in test administration can also create new opportunities for academic dishonesty. If these are properly taken into account, however, the use of ICT allows a higher level of test security than traditional paper-and-pencil data collection for assessment.
Abstract:
Automation has become increasingly necessary during the software test process due to the high cost and time associated with this activity. Some tools have been proposed to automate the execution of acceptance tests in Web applications, but many of them have important limitations, such as a strong dependence on the structure of the HTML pages and the need to assign test-case values manually. In this work, we present IFL4TCG, a language for specifying acceptance test scenarios for Web applications, and a tool that generates test cases from these scenarios. The proposed language supports the Equivalence Class Partitioning criterion, and the tool generates test cases that follow different combination strategies (i.e., Each-Choice, Base-Choice and All Combinations). To evaluate the effectiveness of the proposed solution, we used the language and the associated tool to design and execute acceptance tests on a module of the Sistema Unificado de Administração Pública (SUAP) of the Instituto Federal do Rio Grande do Norte (IFRN). Four systems analysts and one computer technician, who work as developers of that system, participated in the evaluation. Preliminary results showed that IFL4TCG can indeed help to detect defects in Web applications.
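A minimal sketch (illustrative code, not the IFL4TCG tool itself) of two of the combination strategies named above, applied to equivalence classes of two hypothetical input parameters:

```python
# Illustrative sketch: derive test cases from equivalence classes using the
# All Combinations and Each-Choice strategies.
from itertools import product

def all_combinations(classes: dict[str, list[str]]) -> list[dict[str, str]]:
    """One test case per element of the Cartesian product of all classes."""
    names = list(classes)
    return [dict(zip(names, combo)) for combo in product(*classes.values())]

def each_choice(classes: dict[str, list[str]]) -> list[dict[str, str]]:
    """Every class value appears in at least one test case."""
    names = list(classes)
    depth = max(len(v) for v in classes.values())
    return [{n: classes[n][i % len(classes[n])] for n in names} for i in range(depth)]

classes = {"age": ["minor", "adult"], "role": ["student", "teacher", "admin"]}
print(len(all_combinations(classes)))   # 6 test cases
print(each_choice(classes))             # 3 test cases cover every value
```

Base-Choice (not shown) would instead fix a nominal value for every parameter and vary one parameter at a time.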