919 results for Probability Pattern comparison Evaluation and interpretation


Relevance:

100.00%

Publisher:

Abstract:

At a time when disciplined inference and decision making under uncertainty represent common aims for participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and we think rightly so, but they run against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views, which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at the various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory, and with the view according to which this theory is normative.
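
As a concrete example of the kind of coherence constraint such a normative theory prescribes, forensic evaluation of evidence E against competing propositions H_p and H_d is commonly expressed in the odds form of Bayes' theorem:

    \frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)} \times \frac{\Pr(H_p)}{\Pr(H_d)}

The theory does not dictate which numerical values are assigned; it only requires that prior odds, likelihood ratio, and posterior odds be related in exactly this way, which is the sense in which it is normative rather than descriptive.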

Relevance:

100.00%

Publisher:

Abstract:

Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents the calibration and comparison of two systems: a machine vision system that uses 3-channel RGB images and a line-scanning spectral system. Calibration is the process of checking and adjusting the accuracy of a measuring instrument by comparing it with standards. For the RGB system, self-calibrating methods for finding various parameters of the imaging device were developed. Color calibration was performed, and the colors produced by the system were compared to the known color values of the target. Software drivers for the Sony Robot were developed, and a mechanical part to connect a camera to the robot was designed. For the line-scanning spectral system, methods were developed for calibrating the alignment of the system and for measuring the dimensions of the scanned line. Color calibration of the spectral system is also presented.
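
The abstract does not detail the color calibration procedure itself; the following is a minimal sketch of one common approach, assuming a color chart with known reference values and fitting a 3x3 correction matrix by least squares (the array values below are invented for illustration, not taken from the thesis):

    import numpy as np

    # Illustrative data: N patches of a color chart.
    # measured_rgb: values read by the camera; reference_rgb: known target values.
    measured_rgb = np.array([[0.21, 0.18, 0.15],
                             [0.70, 0.68, 0.60],
                             [0.45, 0.20, 0.18],
                             [0.18, 0.40, 0.45]])   # shape (N, 3), hypothetical
    reference_rgb = np.array([[0.20, 0.20, 0.20],
                              [0.75, 0.75, 0.75],
                              [0.50, 0.20, 0.15],
                              [0.15, 0.45, 0.50]])  # shape (N, 3), hypothetical

    # Least-squares fit of a 3x3 matrix M so that measured_rgb @ M approximates reference_rgb.
    M, residuals, rank, _ = np.linalg.lstsq(measured_rgb, reference_rgb, rcond=None)

    corrected = measured_rgb @ M
    rms_error = np.sqrt(np.mean((corrected - reference_rgb) ** 2))
    print(M, rms_error)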

Relevance:

100.00%

Publisher:

Abstract:

In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student, with the results evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems that complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of the mean and a paired t-test.
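
The exercises themselves are authored in Matlab inside Goodle GMS; purely to illustrate the kind of computation the described exercises involve (this is not the system's actual code), the sketch below fits a calibration line with basic quality parameters and runs a paired t-test on invented data:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Exercise 1 style: linear calibration, e.g. instrument response vs. concentration (simulated).
    conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])            # mg/L, hypothetical standards
    signal = 0.050 * conc + 0.012 + rng.normal(0, 0.003, 6)      # simulated responses

    res = stats.linregress(conc, signal)
    s_yx = np.sqrt(np.sum((signal - (res.intercept + res.slope * conc)) ** 2) / (len(conc) - 2))
    lod = 3.3 * s_yx / res.slope   # detection limit from residual standard deviation and slope
    print(f"slope={res.slope:.4f}, intercept={res.intercept:.4f}, r={res.rvalue:.4f}, LOD={lod:.2f} mg/L")

    # Exercise 3 style: paired t-test comparing two methods applied to the same samples.
    method_a = np.array([10.1, 9.8, 10.4, 10.0, 9.9])
    method_b = np.array([10.3, 9.9, 10.6, 10.1, 10.2])
    t_stat, p_value = stats.ttest_rel(method_a, method_b)
    print(f"t={t_stat:.3f}, p={p_value:.3f}")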

Relevance:

100.00%

Publisher:

Abstract:

The aims of this study were to determine whether standard base excess (SBE) is a useful diagnostic tool for metabolic acidosis, whether metabolic acidosis is clinically relevant in daily evaluation of critically ill patients, and to identify the most robust acid-base determinants of SBE. Thirty-one critically ill patients were enrolled. Arterial blood samples were drawn at admission and 24 h later. SBE, as calculated by Van Slyke's (SBE VS) or Wooten's (SBE W) equations, accurately diagnosed metabolic acidosis (AUC = 0.867, 95%CI = 0.690-1.043 and AUC = 0.817, 95%CI = 0.634-0.999, respectively). SBE VS was weakly correlated with total SOFA (r = -0.454, P < 0.001) and was similar to SBE W (r = -0.482, P < 0.001). All acid-base variables were categorized as SBE VS <-2 mEq/L or SBE VS <-5 mEq/L. SBE VS <-2 mEq/L was better able to identify strong ion gap acidosis than SBE VS <-5 mEq/L; there were no significant differences regarding other variables. To demonstrate unmeasured anions, anion gap (AG) corrected for albumin (AG A) was superior to AG corrected for albumin and phosphate (AG A+P) when strong ion gap was used as the standard method. Mathematical modeling showed that albumin level, apparent strong ion difference, AG A, and lactate concentration explained SBE VS variations with an R² = 0.954. SBE VS with a cut-off value of <-2 mEq/L was the best tool to diagnose clinically relevant metabolic acidosis. To analyze the components of SBE VS shifts at the bedside, AG A, apparent strong ion difference, albumin level, and lactate concentration are easily measurable variables that best represent the partitioning of acid-base derangements.
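
For reference, the quantities being compared can be computed as sketched below for a hypothetical arterial sample; the Van Slyke form of SBE and the albumin correction of the anion gap are the commonly quoted versions, and the reference albumin of 4.0 g/dL is an assumption (conventions differ between papers), not a value taken from this study:

    # Hypothetical arterial blood values (units in comments).
    ph        = 7.28
    hco3      = 16.0   # mEq/L
    na, k, cl = 138.0, 4.2, 108.0   # mEq/L
    lactate   = 3.0    # mEq/L
    albumin   = 2.5    # g/dL

    # Standard base excess, Van Slyke equation (commonly quoted form).
    sbe_vs = 0.9287 * (hco3 - 24.4 + 14.83 * (ph - 7.40))

    # Anion gap (here including potassium; some definitions omit it) and its correction for
    # hypoalbuminemia: 2.5 mEq/L per g/dL of albumin below an assumed reference of 4.0 g/dL.
    ag = (na + k) - (cl + hco3)
    ag_albumin_corrected = ag + 2.5 * (4.0 - albumin)

    # Simplified apparent strong ion difference (the full form also includes Ca2+ and Mg2+).
    sid_apparent = (na + k) - (cl + lactate)

    print(f"SBE_VS={sbe_vs:.1f} mEq/L, AG={ag:.1f}, AG_A={ag_albumin_corrected:.1f}, SIDa={sid_apparent:.1f}")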

Relevance:

100.00%

Publisher:

Abstract:

A statistical–dynamical downscaling (SDD) approach for the regionalization of wind energy output (Eout) over Europe, with special focus on Germany, is proposed. SDD uses an extended circulation weather type (CWT) analysis on global daily mean sea level pressure fields with the central point being located over Germany. Seventy-seven weather classes based on the associated CWT and the intensity of the geostrophic flow are identified. Representatives of these classes are dynamically downscaled with the regional climate model COSMO-CLM. By using weather class frequencies of different data sets, the simulated representatives are recombined into probability density functions (PDFs) of near-surface wind speed and finally into Eout of a sample wind turbine for present and future climate. This is performed for reanalysis, decadal hindcasts and long-term future projections. For evaluation purposes, results of SDD are compared to wind observations and to simulated Eout of purely dynamical downscaling (DD) methods. For the present climate, SDD is able to simulate realistic PDFs of 10-m wind speed for most stations in Germany. The resulting spatial Eout patterns are similar to DD-simulated Eout. In terms of decadal hindcasts, results of SDD are similar to DD-simulated Eout over Germany, Poland, the Czech Republic, and Benelux, for which high correlations between annual Eout time series of SDD and DD are detected for selected hindcasts. Lower correlations are found for other European countries. It is demonstrated that SDD can be used to downscale the full ensemble of the Earth System Model of the Max Planck Institute (MPI-ESM) decadal prediction system. Long-term climate change projections of ECHAM5/MPI-OM under Special Report on Emissions Scenarios forcing, as obtained by SDD, agree well with the results of other studies using DD methods, with increasing Eout over northern Europe and a negative trend over southern Europe. Despite some biases, it is concluded that SDD is an adequate tool to assess regional wind energy changes in large model ensembles.
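
To illustrate the recombination step only (the class representatives, frequencies, and power curve below are invented placeholders, not values from the study), per-class wind-speed PDFs can be weighted by class frequencies and converted to energy output with a turbine power curve:

    import numpy as np

    rng = np.random.default_rng(1)
    v = np.linspace(0.0, 30.0, 301)                # wind-speed grid (m/s)
    n_classes = 77                                 # CWT / intensity classes, as in the study

    # Placeholder per-class wind-speed PDFs (Weibull shapes with class-dependent scale),
    # standing in for the dynamically downscaled representatives.
    scales = rng.uniform(4.0, 12.0, n_classes)
    k = 2.0
    class_pdfs = np.array([(k / a) * (v / a) ** (k - 1) * np.exp(-(v / a) ** k) for a in scales])

    # Placeholder class frequencies from the large-scale data set (sum to 1).
    class_freq = rng.dirichlet(np.ones(n_classes))

    # Recombination: frequency-weighted mixture of the class PDFs.
    pdf_mix = class_freq @ class_pdfs

    # Generic turbine power curve (illustrative): cut-in 3 m/s, rated 12 m/s at 2 MW, cut-out 25 m/s.
    power = np.clip((v - 3.0) / (12.0 - 3.0), 0.0, 1.0) ** 3 * 2.0e6
    power[(v < 3.0) | (v > 25.0)] = 0.0

    # Expected power and annual energy output for this mixture.
    dv = v[1] - v[0]
    mean_power = np.sum(pdf_mix * power) * dv      # W
    eout_annual = mean_power * 8760 / 1e6          # MWh per year
    print(f"{eout_annual:.0f} MWh/yr")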

Relevance:

100.00%

Publisher:

Abstract:

We investigated the processes by which adult readers evaluate and revise their situation model during reading, by monitoring their eye movements as they read narrative texts and subsequent critical sentences. In each narrative text, a short introduction primed a knowledge-based inference, followed by a target concept that was either expected (e.g., “oven”) or unexpected (e.g., “grill”) in relation to the inferred concept. Eye movements showed that readers detected a mismatch between the new unexpected information and their prior interpretation, confirming their ability to evaluate inferential information. Just below the narrative text, a critical sentence included a target word that was either congruent (e.g., “roasted”) or incongruent (e.g., “barbecued”) with the expected but not the unexpected concept. Readers spent less time reading the congruent than the incongruent target word, reflecting facilitation from the prior information. In addition, when the unexpected (but not the expected) concept had been presented, participants with lower verbal (but not visuospatial) working memory span exhibited longer reading times and made more regressions (from the critical sentence to previous information) on encountering congruent information, indicating difficulty in inhibiting their initial incorrect interpretation and revising their situation model.

Relevance:

100.00%

Publisher:

Abstract:

In vitro fermentation techniques (IVFT) have been widely used to evaluate the nutritive value of feeds for ruminants and, in the last decade, to assess the effect of different nutritional strategies on methane (CH4) production. However, many technical factors may influence the results obtained. The present review has been prepared by the 'Global Network' FACCE-JPI international research consortium to provide a critical evaluation of the main factors that need to be considered when designing, conducting and interpreting IVFT experiments that investigate nutritional strategies to mitigate CH4 emission from ruminants. Given the increasing and wide-scale use of IVFT, there is a need to critically review reports in the literature and establish what criteria are essential to the establishment and implementation of in vitro techniques. Key aspects considered include: i) donor animal species and number of animals used, ii) diet fed to donor animals, iii) collection and processing of rumen fluid as inoculum, iv) choice of substrate and incubation buffer, v) incubation procedures and CH4 measurements, vi) headspace gas composition and vii) comparability of in vitro and in vivo measurements. Based on an evaluation of experimental evidence, a set of technical recommendations is presented to harmonize IVFT for feed evaluation, assessment of rumen function and CH4 production.

Relevance:

100.00%

Publisher:

Abstract:

Urothelial bladder carcinoma (UBC) is heterogeneous in its pathology and clinical behaviour. Evaluation of prognostic and predictive biomarkers is necessary in order to produce personalised treatment options. The present study used immunohistochemistry to evaluate UBC sections containing tumour and non-tumour areas from 76 patients for the detection of p-mTOR, CD31 and D2-40 (for identification of blood and lymphatic vessels, respectively). Of the non-tumour and tumour sections, 36% and 20% were scored positive for p-mTOR expression, respectively. Immunoexpression was observed in umbrella cells from non-tumour urothelium, in all cell layers from non-muscle-invasive (NMI) tumours (including expression in superficial cells), and in spots of cells from muscle-invasive (MI) tumours. Positive expression decreased from non-tumour to tumour urothelium, and from pT1/pTis to pT3/pT4 tumours; however, the few pT3/pT4-positive cases had worse survival rates, with 5-year disease-free survival being significantly lower. Angiogenesis was impaired in pT3/pT4 tumours that did not express p-mTOR. In conclusion, p-mTOR expression in non-tumour umbrella cells is likely a reflection of their metabolic plasticity, and its extension to the inner layers of the urothelium in NMI tumours is consistent with an enhanced malignant potential. The expression in cell spots in a few MI tumours and the absence of expression in the remaining tumours is intriguing and requires further research. Additional studies regarding the up- and downstream effectors of the mTOR pathway should be conducted.

Relevance:

100.00%

Publisher:

Abstract:

Although several clinical tests have been developed to qualitatively describe complex motor tasks through functional testing, these methods often depend on clinicians' interpretation, experience and training, which makes the assessment results inconsistent and lacking the precision required to objectively assess the effect of a rehabilitative intervention. A more detailed characterization is required to fully capture the various aspects of motor control and performance during complex movements of the lower and upper limbs. Cost-effective and clinically applicable instrumented tests would enable quantitative assessment of performance on a subject-specific basis, overcoming the lack of objectivity associated with individual judgment and possibly disclosing subtle alterations that are not clearly visible to the observer. Postural motion measurements at additional locations, such as the lower and upper limbs and trunk, may be necessary in order to obtain information about inter-segmental coordination during the different functional tests used in clinical practice. With these considerations in mind, this thesis aims: i) to propose a novel quantitative assessment tool for the kinematic and dynamic evaluation of a multi-link kinematic chain during several functional motor tasks (i.e. squat, sit-to-stand, postural sway), using one single-axis accelerometer per segment; ii) to present a novel quantitative technique for estimating upper limb joint kinematics, considering a 3-link kinematic chain, during the Fugl-Meyer Motor Assessment and using one inertial measurement unit per segment. The suggested methods could offer several benefits in clinical practice. The use of objective biomechanical measurements, provided by inertial sensor-based techniques, may help clinicians to: i) objectively track changes in motor ability, ii) provide timely feedback about the effectiveness of administered rehabilitation interventions, iii) enable intervention strategies to be modified or changed if found to be ineffective, and iv) speed up experimental sessions when several subjects are asked to perform different functional tests.
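
The estimation procedure is not spelled out in the abstract; as a minimal illustration of the idea behind one single-axis accelerometer per segment, during quasi-static tasks the sensed gravity component can be converted into a segment inclination angle (the sensor axis convention and the sample readings below are assumptions for illustration, not the thesis's method):

    import numpy as np

    G = 9.81  # gravitational acceleration (m/s^2)

    def segment_inclination(a_axis):
        """Inclination of a body segment from the vertical, estimated from the reading of a
        single-axis accelerometer aligned with the segment's longitudinal axis.
        Valid only for quasi-static movements, where the reading is dominated by gravity."""
        return np.degrees(np.arccos(np.clip(a_axis / G, -1.0, 1.0)))

    # Hypothetical readings (m/s^2) for trunk, thigh and shank at one frame of a slow squat.
    readings = {"trunk": 9.4, "thigh": 7.1, "shank": 8.9}
    angles = {seg: segment_inclination(a) for seg, a in readings.items()}
    print(angles)

    # Relative (inter-segmental) angle, e.g. a knee-like angle between thigh and shank.
    knee_angle = angles["thigh"] - angles["shank"]
    print(f"thigh-shank relative angle: {knee_angle:.1f} deg")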

Relevance:

100.00%

Publisher:

Abstract:

This bachelor's thesis presents a prototype of a hybrid cross-platform mobile application for Android and iOS. Hybrid mobile applications are a combination of mobile web and native mobile applications: they are built partially with web technologies, yet they can also access native features and sensors of the device, and to the user they look like native applications because they are downloaded from the application stores and installed on the device. The prototype consists of migrating the financial news module of a banking company's current mobile applications, reimplementing it as a hybrid application using one of the frameworks available on the market for that purpose. Developing applications in a hybrid way can help reduce cost and effort when targeting more than one platform. The goal of the project is to evaluate the advantages and disadvantages that hybrid development offers in terms of cost reduction, development time, and the final result of the application. The project starts with an analysis of successfully released hybrid applications, using the examples of LinkedIn, Facebook and Financial Times, and emphasizing the technologies used, the mobile network data transmitted, and the problems encountered during development. This analysis is followed by a comparison of the most popular hybrid cross-platform development frameworks in terms of their approaches, supported platforms, programming languages, access to native features, and licensing. This first stage results in the selection of the development framework that best fits the requirements of the project, namely PhoneGap, and continues with a deeper analysis of its architecture, features and components. The next stage analyzes the company's current applications to extract the needed source code and adapt it to the architecture of the prototype. For the implementation of the application, the mechanism that PhoneGap offers to access the native layer of the device, known as a plugin, is used: a custom plugin is designed and developed to access the native layer of each targeted platform. Once the prototype is finished for Android, it is migrated and adapted to the iOS platform. Finally, the prototypes are evaluated in terms of ease and time of development, performance, functionality, and look and feel.

Relevance:

100.00%

Publisher:

Abstract:

This study is concerned with several proposals for multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author concludes that simulation is the only tool powerful enough to develop a model that would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends; this approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model that fits a user's requirements. The study also contains the results of some experimental work carried out using the model: the first part tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
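
The package itself is not reproduced in the abstract; the following small sketch (in Python, purely illustrative, since the original package naturally predates it) conveys the architectural idea of a fixed skeleton whose segments are swapped from a library to fit a user's requirements:

    from dataclasses import dataclass
    import random

    class Segment:
        """Base class for swappable model segments (processor, memory, I/O, ...)."""
        def service_time(self) -> float:
            raise NotImplementedError

    class BaselineCpu(Segment):
        def service_time(self) -> float:
            return random.expovariate(1.0)          # exponential service, rate 1

    class FasterCpu(Segment):
        def service_time(self) -> float:
            return random.expovariate(2.0)          # a faster processor segment

    @dataclass
    class Skeleton:
        """Fixed simulation skeleton; measuring is built in alongside the design."""
        cpu: Segment
        def run(self, n_jobs: int) -> float:
            busy = sum(self.cpu.service_time() for _ in range(n_jobs))
            return busy / n_jobs                    # mean service time measured by the skeleton

    random.seed(1)
    print(Skeleton(cpu=BaselineCpu()).run(1000))    # baseline configuration
    print(Skeleton(cpu=FasterCpu()).run(1000))      # same skeleton, swapped segment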

Relevance:

100.00%

Publisher:

Abstract:

Context traditionally has been regarded in vision research as a determinant for the interpretation of sensory information on the basis of previously acquired knowledge. Here we propose a novel, complementary perspective by showing that context also specifically affects visual category learning. In two experiments involving sets of Compound Gabor patterns we explored how context, as given by the stimulus set to be learned, affects the internal representation of pattern categories. In Experiment 1, we changed the (local) context of the individual signal classes by changing the configuration of the learning set. In Experiment 2, we varied the (global) context of a fixed class configuration by changing the degree of signal accentuation. Generalization performance was assessed in terms of the ability to recognize contrast-inverted versions of the learning patterns. Both contextual variations yielded distinct effects on learning and generalization thus indicating a change in internal category representation. Computer simulations suggest that the latter is related to changes in the set of attributes underlying the production rules of the categories. The implications of these findings for phenomena of contrast (in)variance in visual perception are discussed.
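
For readers unfamiliar with the stimuli, a compound Gabor pattern is a sum of harmonically related sinusoids under a Gaussian envelope, and contrast inversion simply negates the luminance profile; a minimal one-dimensional sketch with arbitrarily chosen parameters (not those of the experiments):

    import numpy as np

    x = np.linspace(-1.0, 1.0, 1001)           # spatial axis (arbitrary units)
    sigma = 0.3                                # width of the Gaussian envelope
    f0 = 4.0                                   # fundamental spatial frequency (cycles/unit)

    # Compound waveform: fundamental plus third harmonic with its own amplitude and phase.
    waveform = 1.0 * np.cos(2 * np.pi * f0 * x) + 0.5 * np.cos(2 * np.pi * 3 * f0 * x + np.pi / 2)

    envelope = np.exp(-x**2 / (2 * sigma**2))
    compound_gabor = envelope * waveform       # luminance profile of the pattern
    contrast_inverted = -compound_gabor        # the generalization test stimulus

    print(compound_gabor[:5], contrast_inverted[:5])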

Relevance:

100.00%

Publisher:

Abstract:

Damage during extreme wind events highlights the weaknesses of mechanical fasteners at the roof-to-wall connections of residential timber frame buildings. The allowable capacity of the metal fasteners is based on results of unidirectional component testing that does not simulate realistic tri-axial aerodynamic loading effects. The first objective of this research was to simulate hurricane effects and study hurricane-structure interaction at full scale, facilitating better understanding of the combined impacts of wind, rain, and debris on inter-component connections at spatial and temporal scales. The second objective was to evaluate the performance of a non-intrusive roof-to-wall connection system using fiber reinforced polymer (FRP) materials and compare its load capacity to the capacity of an existing metal fastener under simulated aerodynamic loads. The Wall of Wind (WoW) testing, performed using FRP connections on a one-story gable-roof timber structure instrumented with a variety of sensors, was used to create a database on aerodynamic and aero-hydrodynamic loading of roof-to-wall connections tested under several parameters: angles of attack, wind-turbulence content, internal pressure conditions, and with and without the effects of rain. Based on the aerodynamic loading results obtained from the WoW tests, sets of three force components (tri-axial mean loads) were combined into a series of resultant mean forces, which were used to test the FRP and metal connections in the structures laboratory up to failure. A new component testing system and test protocol were developed for testing fasteners under simulated tri-axial loading, as opposed to uni-axial loading. The tri-axial and uni-axial test results were compared for hurricane clips, and a comparison was made between the tri-axial load capacities of the FRP and metal connections. The research findings demonstrate that the FRP connection is a viable option for use in timber roof-to-wall connection systems. The findings also confirm that current testing methods for mechanical fasteners tend to overestimate the actual load capacities of a connector. Additionally, the research contributes to the development of a new testing protocol for fasteners using tri-axial simultaneous loads, based on the aerodynamic database obtained from the WoW testing.
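
The combination of three mean force components into a resultant mean force referred to above amounts to the following computation (the component values here are invented, not WoW data):

    import numpy as np

    # Hypothetical mean aerodynamic load components at one roof-to-wall connection (kN).
    f = np.array([0.8, 0.4, 2.6])                         # [F_x, F_y, F_z (uplift)]

    resultant = np.linalg.norm(f)                         # magnitude of the resultant mean force
    direction = f / resultant                             # unit vector giving its orientation
    uplift_angle = np.degrees(np.arccos(direction[2]))    # angle between resultant and vertical

    print(f"resultant = {resultant:.2f} kN, angle from vertical = {uplift_angle:.1f} deg")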