857 results for automatic assessment tool


Relevance:

30.00%

Publisher:

Abstract:

Pharmaceutical equivalence is an important step towards the confirmation of similarity and interchangeability among pharmaceutical products, particularly regarding those that will not be tested for bioequivalence. The aim of this paper is to compare traditional difference testing with two one-sided equivalence tests in the assessment of pharmaceutical equivalence, by means of equivalence studies between similar, generic and reference products of acyclovir cream, atropine sulfate injection, meropenem for injection, and metronidazole injection. All tests were performed in accordance with the Brazilian Pharmacopeia or the United States Pharmacopeia. All four possible combinations of results arose in these comparisons of difference testing and equivalence testing. Most of the difference tests did not show a significant difference, whereas the equivalence tests indicated similarity. We conclude that equivalence testing is more appropriate than difference testing, which can make it a useful tool to assess pharmaceutical equivalence in products that will not be tested for bioequivalence.
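
For illustration, a minimal Python sketch of the two one-sided tests (TOST) procedure compared in the paper, assuming two independent samples of assay results and a hypothetical ±5% equivalence margin; the data, margin and helper function are illustrative, not the paper's own.

```python
# TOST sketch; samples and the +/-5 margin are invented placeholders.
import numpy as np
from scipy import stats

def tost(a, b, margin):
    """Two one-sided t-tests: H0 is |mean(a) - mean(b)| >= margin."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    df = len(a) + len(b) - 2
    p_lower = stats.t.sf((diff + margin) / se, df)   # tests diff > -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)  # tests diff < +margin
    return max(p_lower, p_upper)  # equivalence requires both rejections

generic = [98.7, 101.2, 99.5, 100.3, 98.9, 100.8]    # % of label claim
reference = [99.4, 100.1, 100.9, 99.8, 101.0, 99.2]
p = tost(generic, reference, margin=5.0)
print(f"TOST p = {p:.4f}; equivalent at alpha = 0.05: {p < 0.05}")
```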

Relevance:

30.00%

Publisher:

Abstract:

Abstract Background Atherosclerosis causes millions of deaths annually, yielding billions in expenses around the world. Intravascular Optical Coherence Tomography (IVOCT) is a medical imaging modality that displays high-resolution images of coronary cross-sections. Nonetheless, quantitative information can only be obtained with segmentation; with it, more adequate diagnostics, therapies and interventions can be provided. Since IVOCT is a relatively new modality, many segmentation methods available in the literature for other modalities could be successfully applied to IVOCT images, improving accuracy and usefulness. Method An automatic lumen segmentation approach, based on the Wavelet Transform and Mathematical Morphology, is presented. The methodology is divided into three main parts. First, the preprocessing stage attenuates undesirable information and enhances important information. Second, in the feature extraction block, the wavelet transform is combined with an adapted version of the Otsu threshold; hence, tissue information is discriminated and binarized. Finally, binary morphological reconstruction improves the binary information and constructs the binary lumen object. Results The evaluation was carried out by segmenting 290 challenging images from human and pig coronaries and rabbit iliac arteries; the outcomes were compared with gold standards made by experts. The following accuracy was obtained: True Positive (%) = 99.29 ± 2.96, False Positive (%) = 3.69 ± 2.88, False Negative (%) = 0.71 ± 2.96, Max False Positive Distance (mm) = 0.1 ± 0.07, Max False Negative Distance (mm) = 0.06 ± 0.1. Conclusions By segmenting a number of IVOCT images with various features, the proposed technique proved to be robust and more accurate than published studies; in addition, the method is completely automatic, providing a new tool for IVOCT segmentation.
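
A rough sketch of the described pipeline (wavelet transform, Otsu threshold, morphological cleanup) follows, under simplifying assumptions: the standard Otsu threshold stands in for the authors' adapted version, and the Haar wavelet and morphological operators are illustrative choices, not the published ones.

```python
# Simplified lumen segmentation sketch; parameters are illustrative.
import numpy as np
import pywt
from skimage import filters, morphology

def segment_lumen(image):
    # 1) Preprocessing/feature extraction: the wavelet approximation
    #    attenuates speckle-like detail and keeps the tissue response
    approx, _details = pywt.dwt2(image, "haar")
    # 2) Binarization: Otsu threshold discriminates tissue information
    binary = approx > filters.threshold_otsu(approx)
    # 3) Morphological reconstruction of the binary lumen object
    cleaned = morphology.remove_small_objects(binary, min_size=64)
    return morphology.binary_closing(cleaned, morphology.disk(3))

frame = np.random.rand(512, 512)   # stand-in for a real IVOCT cross-section
mask = segment_lumen(frame)
```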

Relevance:

30.00%

Publisher:

Abstract:

The 1992 Rio Earth Summit was of paramount importance in the consolidation and international dissemination of environmental impact assessment, officially recognized as a tool for informed decision-making towards sustainable development (Principle 17, Rio Declaration) and for the protection of biodiversity (Article 14, Convention on Biological Diversity). A significant later development was the strengthening of strategic environmental assessment in the design of policies, plans and programs. Both forms of impact assessment can establish the necessary connections between one goal of the Rio+20 Conference, reaching an agreement on the transition to a green economy, and the underpinning decision-making processes. Although the Rio+20 Summit struggled to acknowledge this potential, impact assessment should be strengthened in support of both government and business decisions.

Relevance:

30.00%

Publisher:

Abstract:

Elasmobranch stock assessment studies are usually based on fisheries survey data. However, in large marine protected areas (MPAs) the use of destructive techniques must be ruled out in order to avoid population impacts. In 2005, while conducting a marine habitat survey in two marine Special Areas of Conservation (Sebadales de Playa de Inglés and Franja Marina de Mogán) in the south of Gran Canaria Island (Canary Islands, Spain) with underwater towed video (UTV) and underwater visual census (UVC) transects, we recognized the opportunity to assess elasmobranch populations through UTV. The number of observed species and specimens, the overall fieldwork effort and the total surveyed area were determined and compared between methods. Mean observations per day per unit of time (MOPUT) and mean observations per day per unit of surveyed area (MOPUA) were also compared through the Mann–Whitney rank sum test (α=0.05). Data analysis demonstrated that UTV is a very useful tool to rapidly assess elasmobranch populations in large MPAs in good-visibility underwater environments. It can assess larger areas than UVC with the same effort (statistically significant difference found for the MOPUT; p<0.001), leading to more observed species (5 vs 2) and specimens (46 vs 3) per day of work, with no loss in resolution power (MOPUA values were not significantly different between UTV and UVC; p=0.104).
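
The reported comparison can be reproduced in outline with a Mann–Whitney rank sum test; a minimal sketch follows, where the per-day observation rates are invented placeholders, not the survey data.

```python
# Mann-Whitney comparison of daily observation rates; values are invented.
from scipy.stats import mannwhitneyu

moput_utv = [4.1, 3.8, 4.5, 4.0, 3.9]   # observations per unit time, UTV
moput_uvc = [0.6, 0.4, 0.7, 0.5, 0.6]   # observations per unit time, UVC

stat, p = mannwhitneyu(moput_utv, moput_uvc, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")   # p < 0.05 -> the methods differ
```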

Relevance:

30.00%

Publisher:

Abstract:

[ES] The aim of this research is to obtain reliability parameters from the application of the validated Team Sport Assessment Procedure (TSAP) tool.

Relevance:

30.00%

Publisher:

Abstract:

Process algebraic architectural description languages provide a formal means for modeling software systems and assessing their properties. In order to bridge the gap between system modeling and system implementation, this thesis proposes an approach for automatically generating multithreaded object-oriented code from process algebraic architectural descriptions, in a way that preserves, under certain assumptions, the properties proved at the architectural level. The approach is divided into three phases, which are illustrated by means of a running example based on an audio processing system. First, we develop an architecture-driven technique for thread coordination management, which is completely automated through a suitable package. Second, we address the translation of the algebraically specified behavior of the individual software units into thread templates, which have to be filled in by the software developer according to certain guidelines. Third, we discuss performance issues related to the suitability of synthesizing monitors rather than threads from software unit descriptions that satisfy specific constraints. In addition to the running example, we present two case studies, about a video animation repainting system and the implementation of a leader election algorithm, in order to demonstrate the whole approach. The outcome of this thesis is the implementation of the proposed approach in a translator called PADL2Java and its integration in the architecture-centric verification tool TwoTowers.
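
As an illustration of the monitor-versus-thread alternative discussed above, here is a minimal sketch in Python (the thesis itself targets Java via PADL2Java); the class name and buffer semantics are hypothetical, standing in for a software unit of the audio processing example.

```python
# Hypothetical monitor-style software unit: mutual exclusion plus
# condition-based coordination instead of a free-running thread.
import threading

class AudioBufferMonitor:
    def __init__(self, capacity=8):
        self._items = []
        self._capacity = capacity
        self._cond = threading.Condition()

    def put(self, sample):
        with self._cond:                      # exclusive access
            while len(self._items) >= self._capacity:
                self._cond.wait()             # coordination: wait for space
            self._items.append(sample)
            self._cond.notify_all()

    def get(self):
        with self._cond:
            while not self._items:
                self._cond.wait()             # coordination: wait for data
            sample = self._items.pop(0)
            self._cond.notify_all()
            return sample
```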

Relevance:

30.00%

Publisher:

Abstract:

[EN] Introduction to Content and Language Integrated Learning (CLIL) is a 60-hour online course offered by Universidad de Las Palmas de Gran Canaria within its extracurricular courses. Over three years we have had 76 students, whose final assessment has been to create a CLIL lesson that was evaluated by their peers. The tool for the peer assessment has been the forums of the Moodle platform. The assessment was guided by a list of questions the students had to apply when evaluating the lessons, but the replies from the CLIL lesson creators did not have any predetermined structure or guidelines to follow. We have analysed the assessments and replies by grouping them according to content similarities.

Relevance:

30.00%

Publisher:

Abstract:

Nuclear Magnetic Resonance (NMR) is a branch of spectroscopy based on the fact that many atomic nuclei may be oriented by a strong magnetic field and will absorb radiofrequency radiation at characteristic frequencies. The parameters that can be measured on the resulting spectral lines (line positions, intensities, line widths, multiplicities and transients in time-dependent experiments) can be interpreted in terms of molecular structure, conformation, molecular motion and other rate processes. In this way, high resolution (HR) NMR allows qualitative and quantitative analysis of samples in solution, in order to determine the structure of molecules in solution, and more. In the past, high-field NMR spectroscopy was mainly concerned with the elucidation of chemical structure in solution, but today it is emerging as a powerful exploratory tool for probing biochemical and physical processes. It represents a versatile tool for the analysis of foods. Many NMR studies have been reported in the literature on different types of food, such as wine, olive oil, coffee, fruit juices, milk, meat, egg, starch granules and flour, using different NMR techniques. Traditionally, univariate analytical methods have been used to explore spectroscopic data. Such a method measures or selects a single descriptive variable from the whole spectrum and, in the end, only this variable is analyzed. This univariate approach, applied to HR-NMR data, leads to several problems, due especially to the complexity of an NMR spectrum. The spectrum is composed of different signals belonging to different molecules, and the same molecule can be represented by several, generally strongly correlated, signals. Univariate methods, in this case, take into account only one or a few variables, causing a loss of information. Thus, when dealing with complex samples like foodstuffs, univariate analysis of spectral data is not powerful enough. Spectra need to be considered in their wholeness, and analysing them requires the whole data matrix: chemometric methods are designed to treat such multivariate data. Multivariate data analysis is used for a number of distinct purposes, and the aims can be divided into three main groups: • data description (explorative data structure modelling of any generic n-dimensional data matrix, e.g. PCA); • regression and prediction (PLS); • classification and prediction of class belonging for new samples (LDA, PLS-DA and ECVA). The aim of this PhD thesis was to verify the possibility of identifying and classifying plants or foodstuffs into different classes, based on the concerted variation in metabolite levels detected by NMR spectra, using multivariate data analysis as a tool to interpret the NMR information. It is important to underline that the results obtained are useful to point out the metabolic consequences of a specific modification on foodstuffs, avoiding the use of a targeted analysis for the different metabolites. The data analysis is performed by applying chemometric multivariate techniques to the dataset of acquired NMR spectra. The research work presented in this thesis is the result of a three-year PhD study.
This thesis reports the main results obtained from two main activities: A1) evaluation of a data pre-processing system to minimize unwanted sources of variation, due to different instrumental set-ups, manual spectra processing and sample preparation artefacts; A2) application of multivariate chemometric models in data analysis.
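
As an illustration of the explorative (data description) step listed above, a minimal PCA sketch on a spectra matrix, one row per sample and one column per spectral variable; the random matrix merely stands in for real, preprocessed NMR spectra.

```python
# PCA sketch for explorative analysis of an NMR data matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

spectra = np.random.rand(40, 2048)      # 40 samples x 2048 spectral points
X = StandardScaler().fit_transform(spectra)

pca = PCA(n_components=3)
scores = pca.fit_transform(X)           # sample coordinates in PC space
print("Explained variance ratio:", pca.explained_variance_ratio_)
# Groupings in the score plot would reveal classes of samples driven by
# concerted variation in metabolite signals.
```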

Relevance:

30.00%

Publisher:

Abstract:

Among the experimental methods commonly used to define the behaviour of a full-scale system, dynamic tests are the most complete and efficient procedures. A dynamic test is an experimental process that defines a set of characteristic parameters of the dynamic behaviour of the system, such as the natural frequencies of the structure, the mode shapes and the associated modal damping values. An assessment of these modal characteristics can be used both to verify the theoretical assumptions of the project and to monitor the performance of the structural system during its operational use. The thesis is structured in the following chapters. The first, introductory chapter recalls some basic notions of dynamics of structures, focusing the discussion on systems with multiple degrees of freedom (MDOF), which can represent a generic real system under study when excited with a harmonic force or in free vibration. The second chapter is entirely centred on the dynamic identification of a structure subjected to an experimental test in forced vibrations. It first describes the construction of the FRF through the classical FFT of the recorded signal. A different method, also in the frequency domain, is subsequently introduced; it allows the FRF to be computed accurately using the geometric characteristics of the ellipse that represents the direct input-output comparison. The two methods are compared, and the attention is then focused on some advantages of the proposed methodology. The third chapter focuses on the study of real structures subjected to experimental tests where the force is not known, as in an ambient or impact test. In this analysis we decided to use the CWT, which allows a simultaneous investigation in the time and frequency domain of a generic signal x(t). The CWT is first introduced to process free oscillations, with excellent results in terms of frequencies, damping and vibration modes. Its application in the case of ambient vibrations yields accurate modal parameters of the system, although some important observations should be made about the damping. The fourth chapter still deals with the post-processing of data acquired after a vibration test, but this time through the application of the discrete wavelet transform (DWT). In the first part the results obtained by the DWT are compared with those obtained by the application of the CWT. Particular attention is given to the use of the DWT as a tool for filtering the recorded signal; in fact, in the case of ambient vibrations, the signals are often affected by a significant level of noise. The fifth chapter focuses on another important aspect of the identification process: model updating. In this chapter, starting from the modal parameters obtained from environmental vibration tests performed on the Humber Bridge in England by the University of Porto in 2008 and by the University of Sheffield, an FE model of the bridge is defined, in order to establish what type of model is able to capture more accurately the real dynamic behaviour of the bridge. The sixth chapter outlines the conclusions of the presented research.
They concern the application of a frequency-domain method to evaluate the modal parameters of a structure and its advantages, the advantages of applying a procedure based on wavelet transforms in the identification process for tests with unknown input, and finally the problem of 3D modelling of systems with many degrees of freedom and with different types of uncertainty.
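
As an illustration of the classical frequency-domain step recalled in the second chapter, a minimal sketch of FFT-based FRF estimation (the H1 estimator) from an input force x(t) and a response y(t); the synthetic resonant system and all parameters are assumptions for the example.

```python
# H1 FRF estimate from synthetic input/output signals.
import numpy as np
from scipy import signal

fs = 1000.0                                   # sampling frequency [Hz]
t = np.arange(0, 10, 1 / fs)
x = np.random.randn(t.size)                   # broadband input force
b, a = signal.butter(2, [45, 55], btype="bandpass", fs=fs)
y = signal.lfilter(b, a, x)                   # stand-in resonant response

f, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)   # cross-spectral density
_, Pxx = signal.welch(x, fs=fs, nperseg=1024)    # input auto-spectrum
H1 = Pxy / Pxx                                   # FRF estimate
print(f"Estimated resonance: {f[np.argmax(np.abs(H1))]:.1f} Hz")
```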

Relevance:

30.00%

Publisher:

Abstract:

The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept which has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his social context; from this need an interdisciplinary sector called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (interconnected conditions that occur in an activity) and individualization (characteristics that distinguish an individual). This shift of focus to the individual's need undermines the rigid linearity of the classical model, which has been overtaken by the "berry picking" model; the latter explains that search terms change thanks to the informational feedback received from the search activity, introducing the concept of the evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and the recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed, because it is now expressed through the generation of the query together with its own context. As a matter of fact, the method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing each IR system led to developing it as a middleware of interaction between the user and the IR system. Thereby the system has just two possible actions: rewriting the query and reordering the results. Equivalent actions have been described in PS, which generally exploits information derived from the analysis of user behavior, while the proposed approach exploits knowledge provided by the user. The thesis went further, generating a novel method for an assessment procedure, according to the "Cranfield paradigm", in order to evaluate this type of IR system. The results achieved are interesting considering both the effectiveness achieved and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
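
A minimal sketch of the middleware idea, whose only two actions are rewriting the query from a local knowledge base and reordering the results of the underlying IR system; the knowledge base, search function and ranking heuristic are hypothetical placeholders.

```python
# Hypothetical personalization middleware: rewrite query, reorder results.
local_kb = {"jaguar": ["felidae", "wildlife"]}   # user-provided context

def rewrite_query(query):
    # Action 1: expand each term with its user-specific context
    terms = query.split()
    expansion = [c for t in terms for c in local_kb.get(t, [])]
    return " ".join(terms + expansion)

def personalize(query, search_fn):
    results = search_fn(rewrite_query(query))
    context = {c for cs in local_kb.values() for c in cs}
    # Action 2: reorder by overlap between each result and the context
    return sorted(results, key=lambda doc: -len(context & set(doc.split())))
```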

Relevance:

30.00%

Publisher:

Abstract:

In the frame of EU rural policy, increasingly oriented towards environmental concerns and green livelihoods, Romania stands out for the predominance of rural areas and high nature value farming. The country has to face the challenge of combining the modernization of rural farming systems with the valorization of local assets. Tourism has emerged as one of the main drivers of change and a contributor to the sustainable exploitation of local resources. Rural tourism (RT) can foster the enhancement of territorial capital (TC), the preservation of public goods (PGs) and the promotion of a more environmentally oriented livelihood. The research focuses on a case study area, two valleys in Maramureş, where environmental approaches as diversification strategies are only partially explored. The work investigates the role of tourism initiatives in the promotion of green-oriented practices. The first part of the work is based on a literature review and an interdisciplinary analysis of secondary data to identify the key issues: from rural development policy to the concepts of TC, PGs and RT. The Romanian development programmes and related strategies are investigated; afterwards the characteristics of the County and the role of RT in diversification and valorisation policies are considered. The second part is based on the collection of primary data through interviews with different local stakeholders (farmers owning rural guesthouses, local administrators, networks and artisans). The main frequencies are analyzed, a cluster analysis is computed to evaluate the similarities within the most representative groups, and a comparative analysis is carried out between the two valleys. The frame of the analysis is based on a set of indicators following the dimensions of the TC, to assess the characteristics of the local stakeholders and to outline their perception of the local PGs and of the strategies adopted to manage the territory. Final considerations are elaborated and a few scenarios are outlined, giving relevance to the importance of improving awareness and creating embeddedness among public-private local stakeholders and resources as a tool for the socio-economic and environmental development of the area.
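
For the clustering step mentioned above, a minimal sketch in which stakeholders described by territorial-capital indicators are grouped by similarity; the indicator matrix and the number of clusters are illustrative assumptions.

```python
# Hierarchical clustering sketch on a stakeholder-by-indicator matrix.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

indicators = np.random.rand(12, 4)      # 12 stakeholders x 4 TC indicators
tree = linkage(indicators, method="ward")
groups = fcluster(tree, t=3, criterion="maxclust")
print("Cluster membership per stakeholder:", groups)
```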

Relevance:

30.00%

Publisher:

Abstract:

Proper hazard identification has become progressively more difficult to achieve, as witnessed by several major accidents that took place in Europe, such as the ammonium nitrate explosion at Toulouse (2001) and the vapour cloud explosion at Buncefield (2005), whose accident scenarios were not considered in their site safety cases. Furthermore, the rapid renewal of industrial technology has brought about the need to upgrade hazard identification methodologies. Accident scenarios of emerging technologies, which are not yet properly identified, may remain unidentified until they take place for the first time. The consideration of atypical scenarios, deviating from normal expectations of unwanted events or from worst-case reference scenarios, is thus extremely challenging. A specific method named Dynamic Procedure for Atypical Scenarios Identification (DyPASI) was developed as a complementary tool to bow-tie identification techniques. The main aim of the methodology is to provide an easier but comprehensive hazard identification of the industrial process analysed, by systematizing information from early signals of risk related to past events, near misses and inherent studies. DyPASI was validated on two examples of new and emerging technologies: Liquefied Natural Gas regasification and Carbon Capture and Storage. The study broadened the knowledge of the related emerging risks and, at the same time, demonstrated that DyPASI is a valuable tool to obtain a complete and updated overview of potential hazards. Moreover, in order to tackle the underlying causes of atypical accidents, three methods for the development of early warning indicators were assessed: the Resilience-based Early Warning Indicator (REWI) method, the Dual Assurance method and the Emerging Risk Key Performance Indicator method. REWI was found to be the most complementary and effective of the three, demonstrating that its synergy with DyPASI would be an adequate strategy to improve hazard identification methodologies towards the capture of atypical accident scenarios.

Relevance:

30.00%

Publisher:

Abstract:

This thesis proposes design methods and test tools for optical systems, which may be used in an industrial environment, where not only precision and reliability but also ease of use is important. The approach to the problem has been conceived to be as general as possible, although in the present work the design of a portable device for automatic identification applications has been studied, because this doctorate has been funded by Datalogic Scanning Group s.r.l., a world-class producer of barcode readers. The main functional components of the complete device are the electro-optical imaging, illumination and pattern generator systems. As concerns the electro-optical imaging system, a characterization tool and an analysis tool have been developed to check whether the desired performance of the system has been achieved. Moreover, two design tools for optimizing the imaging system have been implemented. The first optimizes just the core of the system, the optical part, improving its performance while ignoring all other contributions and generating a good starting point for the optimization of the whole complex system. The second tool optimizes the system taking into account its behavior with a model as close to reality as possible, including optics, electronics and detection. As concerns the illumination and pattern generator systems, two tools have been implemented. The first allows the design of free-form lenses, described by an arbitrary analytical function and excited by an incoherent source, and is able to provide custom illumination conditions for all kinds of applications. The second tool consists of a new method to design Diffractive Optical Elements excited by a coherent source for large pattern angles, using the Iterative Fourier Transform Algorithm. Validation of the design tools has been obtained, whenever possible, by comparing the performance of the designed systems with that of fabricated prototypes. In other cases simulations have been used.
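
As an illustration of the Iterative Fourier Transform Algorithm mentioned for DOE design, a minimal Gerchberg-Saxton-style sketch; the target pattern and iteration count are illustrative, and real DOE design adds constraints this sketch omits.

```python
# IFTA sketch: iterate between the DOE plane and the far field, enforcing
# the target intensity while keeping a phase-only element.
import numpy as np

def ifta(target_amplitude, iterations=50):
    phase = 2 * np.pi * np.random.rand(*target_amplitude.shape)
    for _ in range(iterations):
        far_field = np.fft.fft2(np.exp(1j * phase))          # propagate
        constrained = target_amplitude * np.exp(1j * np.angle(far_field))
        phase = np.angle(np.fft.ifft2(constrained))          # back-propagate
    return phase   # phase profile of the diffractive element

target = np.zeros((256, 256))
target[96:160, 96:160] = 1.0     # hypothetical square illumination pattern
doe_phase = ifta(target)
```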

Relevance:

30.00%

Publisher:

Abstract:

Falls are caused by complex interactions between multiple risk factors, which may be modified by age, disease and environment. A variety of methods and tools for fall risk assessment have been proposed, but none is universally accepted. Existing tools are generally not capable of providing a quantitative predictive assessment of fall risk. Objective, cost-effective and clinically applicable methods would enable quantitative assessment of fall risk on a subject-specific basis. Objectively tracking fall risk could provide timely feedback about the effectiveness of administered interventions, enabling intervention strategies to be modified or changed if found to be ineffective. Moreover, some of the fundamental factors leading to falls, and what actually happens during a fall, remain unclear. Objectively documented and measured falls are needed to improve knowledge of falls, in order to develop more effective prevention strategies and prolong independent living. In the last decade, several research groups have developed sensor-based automatic or semi-automatic fall risk assessment tools using wearable inertial sensors. This approach may also serve to detect falls. At the moment, i) several fall-risk assessment studies based on inertial sensors, even if promising, lack a biomechanical model-based approach, which could provide accurate and more detailed measurements of interest (e.g., joint moments, forces), and ii) the number of published real-world falls of older people in a real-world environment is minimal, since most authors have used simulations with healthy volunteers as a surrogate for real-world falls. With these limitations in mind, this thesis aims i) to suggest a novel method for the kinematic and dynamic evaluation of functional motor tasks, often used in clinics for fall-risk evaluation, through a body sensor network and a biomechanical approach, and ii) to define the guidelines for a fall detection algorithm based on the availability of a real-world fall database.
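
As an illustration of the kind of rule a wearable-sensor fall detector can use, a minimal threshold-based sketch on accelerometer data; the thresholds and the detection logic are illustrative assumptions, not the validated guidelines the thesis aims to define.

```python
# Threshold-based fall detection sketch; all thresholds are illustrative.
import numpy as np

def detect_falls(acc, fs=100, impact_g=2.5, still_g=0.15, window_s=1.0):
    """acc: (N, 3) accelerations in g; returns indices of candidate falls."""
    magnitude = np.linalg.norm(acc, axis=1)
    w = int(window_s * fs)
    falls = []
    for i in np.where(magnitude > impact_g)[0]:       # impact peak
        after = magnitude[i + w : i + 2 * w]
        # Candidate fall: high impact followed by near-stillness (lying)
        if after.size and np.all(np.abs(after - 1.0) < still_g):
            falls.append(i)
    return falls
```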

Relevance:

30.00%

Publisher:

Abstract:

Life Cycle Assessment (LCA) is a chain-oriented tool to evaluate the environmental performance of products, focusing on the entire life cycle of these products: from the extraction of resources, via manufacturing and use, to the final processing of the disposed products. Throughout all these stages, the consumption of resources and pollutant releases to air, water and soil are identified and quantified in the Life Cycle Inventory (LCI) analysis. The LCI phase is followed by the Life Cycle Impact Assessment (LCIA) phase, which has the purpose of converting resource consumption and pollutant releases into environmental impacts. The LCIA aims to model and evaluate environmental issues, called impact categories. Several reports emphasise the importance of LCA in the field of engineered nanomaterials (ENMs). ENMs offer enormous potential for the development of new products and applications. There are, however, unanswered questions about the impacts of ENMs on human health and the environment. In the last decade, the increasing production, use and consumption of nanoproducts, with a consequent release into the environment, has accentuated the obligation to ensure that potential risks are adequately understood, to protect both human health and the environment. Thanks to its holistic and comprehensive assessment, LCA is an essential tool to evaluate, understand and manage the environmental and health effects of nanotechnology. The evaluation of the health and environmental impacts of nanotechnologies throughout their whole life cycle by means of LCA methodology is, however, hampered by the lack of knowledge in relation to risk assessment. In fact, to date, knowledge of human and environmental exposure to nanomaterials, such as engineered nanoparticles (ENPs), is limited. This bottleneck is reflected in LCA, where characterisation models, and consequently characterisation factors, for ENPs are missing. The PhD project aims to assess the limitations and challenges of evaluating the freshwater aquatic ecotoxicity potential in the LCIA phase for ENPs, and in particular for nanoparticles such as n-TiO2.
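
As an illustration of the LCIA step discussed above, a minimal sketch converting inventory emissions into a freshwater ecotoxicity score by multiplying each emitted mass by a characterisation factor (CF); the flows and the CF value are placeholders, and the missing entry mirrors the gap described for ENPs such as n-TiO2.

```python
# LCIA sketch: impact = sum(mass_i * CF_i); all values are placeholders.
inventory = {"n-TiO2 to freshwater": 0.8, "Zn ion to freshwater": 0.02}  # kg
cf = {"n-TiO2 to freshwater": None, "Zn ion to freshwater": 28.0}  # CTUe/kg

impact = 0.0
for flow, mass in inventory.items():
    if cf[flow] is None:   # no characterisation factor available
        print(f"Missing CF for {flow}: impact is underestimated")
        continue
    impact += mass * cf[flow]

print(f"Freshwater ecotoxicity: {impact:.2f} CTUe")
```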