894 results for risk need responsivity model
Abstract:
Integer ambiguity resolution is an indispensable procedure for all high-precision GNSS applications. The correctness of the estimated integer ambiguities is the key to achieving highly reliable positioning, but the solution cannot be validated with classical hypothesis testing methods. Integer aperture estimation theory unifies all existing ambiguity validation tests and provides a new perspective from which to review existing methods, enabling a better understanding of the ambiguity validation problem. This contribution analyses two simple but efficient ambiguity validation tests, the ratio test and the difference test, from three aspects: acceptance region, probability basis and numerical results. The major contributions of this paper can be summarized as follows: (1) The ratio test acceptance region is an overlap of ellipsoids, while the difference test acceptance region is an overlap of half-spaces. (2) The probability basis of these two popular tests is analysed for the first time. The difference test is an approximation to the optimal integer aperture estimator, while the ratio test follows an exponential relationship in probability. (3) The limitations of the two tests are identified for the first time: both may underestimate the failure risk if the model is not strong enough or the float ambiguities fall in particular regions. (4) Extensive numerical results are used to compare the performance of the two tests. The simulation results show that the ratio test outperforms the difference test in some models, while the difference test performs better in others. In particular, in the medium-baseline kinematic model the difference test outperforms the ratio test; this superiority is independent of the number of frequencies, observation noise and satellite geometry, but depends on the success rate and the failure rate tolerance. A smaller failure rate tolerance leads to a larger performance discrepancy.
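The two acceptance tests can be written down compactly. Below is a minimal sketch (not the paper's implementation) of how the ratio and difference tests compare the best and second-best integer candidates against the float solution; the function name, inputs and critical values are illustrative, and in practice the critical values are taken from fixed-failure-rate tables or simulation.

    import numpy as np

    def ambiguity_tests(a_float, Q, a_best, a_second, ratio_c=2.0, diff_d=10.0):
        """Accept/reject the best integer candidate with the ratio and difference tests.

        a_float  : float ambiguity vector (n,)
        Q        : float ambiguity covariance matrix (n, n)
        a_best   : best integer candidate (n,)
        a_second : second-best integer candidate (n,)
        ratio_c and diff_d are placeholder critical values.
        """
        Qinv = np.linalg.inv(Q)

        def sq_norm(v):
            # squared norm weighted by the inverse ambiguity covariance
            return float(v @ Qinv @ v)

        r1 = sq_norm(a_float - a_best)       # residual of the best candidate
        r2 = sq_norm(a_float - a_second)     # residual of the second-best candidate

        accept_ratio = (r2 / r1) >= ratio_c  # ratio test: acceptance region is an overlap of ellipsoids
        accept_diff = (r2 - r1) >= diff_d    # difference test: acceptance region is an overlap of half-spaces
        return accept_ratio, accept_diff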
Abstract:
Informed broadly by the theory of planned behaviour, this study used qualitative methodology to understand Australian adults' sun-protective decisions. Forty-two adults participated in focus groups where they discussed behavioural (advantages and disadvantages), normative (important referents), and control (barriers and facilitators) beliefs, as well as potential social influences and images of tanned and non-tanned people. Responses were analysed using the consensual qualitative research approach to determine the dominant themes. Themes of fashion and comfort were prominent; the important role of friends and family in sun-safe decision-making was highlighted, as was the availability of sun-protective measures (e.g., in an accessible place or in the environment). Additional themes included the need to model sound sun-protective behaviours to (current and future) children, the emphasis on personal choice and personal responsibility to be sun safe, and the influence of Australian identity and culture on tanning and socially acceptable forms of sun protection. These beliefs can be used to inform interventions and public health campaigns targeting sun safety among Australians, a population with the highest skin cancer incidence in the world.
Abstract:
This chapter describes the evolution of a model to propose the relationship between food literacy and nutrition. This model can also be used as a framework for program planning, implementation and evaluation. Practitioners and policy makers invest in food literacy with outcome expectations beyond diet quality. For this reason, a second model was developed to conceptualise the role of food literacy with respect to food security, body weight and chronic disease risk. This second model is useful in positioning food literacy within multi-strategic public health nutrition and chronic disease plans.
Abstract:
Reflecting on homosexuality, AIDS and their social and subjective ramifications over the last 30 years, this dissertation discusses the bareback phenomenon (sex without a condom), named in the United States in the second half of the 1990s. Its dissemination in the media has frequently provoked reactions that reconnect homosexuality to madness, illness and death. Given the limited academic production on the subject in Brazil, this work aims to contribute some considerations essential to the debate. We trace some historically important shifts relating to homosexuality, the conduct of conduct (practices of government), risk, AIDS and bareback itself. The work thus combines a theoretical study of this object with interviews conducted in the city of Rio de Janeiro. The exploratory empirical research gathered data and discourses on this phenomenon in our reality and context, taking as its field the Associação Brasileira Interdisciplinar de Aids (ABIA) and Grupo Pela Vidda-RJ, two non-governmental organizations working with AIDS, and Grupo Arco-íris, an NGO that is part of the LGBT movement. Two people from each of these NGOs were interviewed. We sought to understand how these institutions, the privileged sites of our inquiry, have been approaching the phenomenon, and what their positions and impressions are. In parallel, we contacted volunteers who practise bareback sex, considering their discourses indispensable and capable of making this work richer and more diverse in its understanding of bareback, starting from their individual experiences. To this end, we used two international bareback websites (barebackrt.com and bareback.com) that host profiles of Brazilians, some living in the city of Rio de Janeiro, through which three practitioners were brought into the research. Our hypothesis is that attempts to decipher bareback, to give it a meaning, a truth, end up following normative paths whose limits become apparent as we realize that the diversity of erotic-sexual practices and the singularity and subjectivity of the subjects transcend any attempt at normatization/normalization. We therefore believe that what we call bareback, whether phenomenon, subculture, practice or behaviour, cannot be defined as a cohesive set of discourses, fantasies and erotic-sexual practices; on the contrary, it presents itself through multiple and even more varied faces, leaving only the allusion that characterizes it: sex without a condom, which does not always mean unprotected sex. Thus, taking as our perspective the notions of the conduct of conduct and care of the self proposed by Michel Foucault, we discuss the meaning of dissident sexual practices and the questions concerning normalization, pathologization and forms of resistance.
Abstract:
Using an efficient iterative Newton-Euler method, a whole-body dynamic model of a 21-degree-of-freedom wheeled mobile humanoid robot was built. Although the model is high-dimensional, it eliminates the difficult problem, present in modular modelling, of having to model the interaction forces between modules; moreover, owing to the symmetric structure of the robot's two arms, the dynamic model is partially simplified when the arm motions are planned appropriately. The forces and torques produced at each joint by the motion of a given joint are also analysed in simulation. The analytical and simulation results show that properly coordinating the motion of the upper-arm joints greatly attenuates the force and torque disturbances on the vehicle body and waist joints, providing a theoretical basis for dynamics-based motion control and stability analysis of the robot.
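The full 21-degree-of-freedom wheeled humanoid model cannot be reproduced from the abstract, but the underlying idea of computing the joint torques induced by a planned motion can be illustrated. The sketch below gives the closed-form inverse dynamics of a planar two-link arm, which is what a recursive Newton-Euler pass reduces to for this simple chain; all inertial parameters are illustrative placeholders, not the robot's.

    import numpy as np

    def two_link_inverse_dynamics(q, dq, ddq,
                                  m1=1.0, m2=1.0, l1=0.5, lc1=0.25, lc2=0.25,
                                  I1=0.02, I2=0.02, g=9.81):
        """Joint torques of a planar two-link arm for a planned motion (q, dq, ddq)."""
        q1, q2 = q
        c2 = np.cos(q2)

        # Mass matrix M(q)
        M11 = m1 * lc1**2 + m2 * (l1**2 + lc2**2 + 2 * l1 * lc2 * c2) + I1 + I2
        M12 = m2 * (lc2**2 + l1 * lc2 * c2) + I2
        M22 = m2 * lc2**2 + I2
        M = np.array([[M11, M12], [M12, M22]])

        # Coriolis/centrifugal matrix C(q, dq)
        h = m2 * l1 * lc2 * np.sin(q2)
        C = np.array([[-h * dq[1], -h * (dq[0] + dq[1])],
                      [ h * dq[0], 0.0]])

        # Gravity vector G(q), with q measured from the horizontal
        G = np.array([(m1 * lc1 + m2 * l1) * g * np.cos(q1) + m2 * lc2 * g * np.cos(q1 + q2),
                      m2 * lc2 * g * np.cos(q1 + q2)])

        # Joint torques required to realize the planned acceleration
        return M @ np.asarray(ddq) + C @ np.asarray(dq) + G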
Abstract:
The area of mortality modelling has received significant attention over the last 20 years owing to the need to quantify and forecast improving mortality rates. This need is driven primarily by the concern of governments, insurance and actuarial professionals, and individuals to be able to fund old age. In particular, to quantify the costs of increasing longevity we need suitable models of mortality rates that capture the dynamics of the data and forecast them with sufficient accuracy to be useful. In this paper we test several of those models by considering their fitting quality and, in particular, by testing the residuals of those models for normality. In a wide-ranging study covering 30 countries we find that, almost without exception, the residuals do not demonstrate normality. Further, Hurst tests of the residuals provide evidence that structure remains which is not captured by the models.
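As an illustration of the residual diagnostics described, the following sketch applies a standard normality test to the residuals of a fitted mortality model; the Jarque-Bera statistic is one common choice, and the input array simply stands in for the age-period residual matrix (the paper's specific models and data are not reproduced).

    import numpy as np
    from scipy import stats

    def residual_normality_report(residuals, alpha=0.05):
        """Test a flattened array of standardized model residuals for normality."""
        r = np.asarray(residuals).ravel()
        jb_stat, jb_p = stats.jarque_bera(r)          # joint test on skewness and kurtosis
        return {
            "jarque_bera_stat": float(jb_stat),
            "p_value": float(jb_p),
            "skewness": float(stats.skew(r)),
            "kurtosis": float(stats.kurtosis(r, fisher=False)),
            "reject_normality": jb_p < alpha,
        }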
Abstract:
Low-velocity impact damage can drastically reduce the residual strength of a composite structure even when the damage is barely visible. The ability to computationally predict the extent of damage and the compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant time and cost penalties. A high-fidelity three-dimensional composite damage model, predicting both low-velocity impact damage and the CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The intralaminar damage model component accounts for physically based tensile and compressive failure mechanisms of the fibres and matrix when subjected to a three-dimensional stress state. Cohesive behaviour was employed to model interlaminar failure between plies, with a bi-linear traction-separation law capturing damage onset and subsequent damage evolution. The virtual tests, set up in ABAQUS/Explicit, were executed in three steps: the first to capture the impact damage, the second to stabilize the specimen by imposing the new boundary conditions required for compression testing, and the third to predict the CAI strength. The observed intralaminar damage features and delamination damage area, as well as the residual strength, are discussed. It is shown that the predicted impact damage and CAI strength correlated well with experimental testing without the need for model calibration, which is often required with other damage models.
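The interlaminar part of the model is described as a bi-linear traction-separation law. A minimal sketch of such a law for a single mode-I opening is given below; the parameter names and values are illustrative and do not reproduce the authors' subroutine.

    def bilinear_cohesive_traction(delta, K=1.0e5, t0=60.0, Gc=0.5):
        """Mode-I traction for a bilinear traction-separation law.

        delta : current opening separation
        K     : initial (penalty) stiffness
        t0    : interface strength at damage onset
        Gc    : critical energy release rate (area under the curve)
        Returns (traction, damage); all values are illustrative.
        """
        delta0 = t0 / K            # separation at damage onset
        deltaF = 2.0 * Gc / t0     # separation at complete failure
        if delta <= delta0:
            return K * delta, 0.0                      # linear-elastic branch
        if delta >= deltaF:
            return 0.0, 1.0                            # fully damaged interface
        # linear softening branch: damage variable grows from 0 to 1
        d = (deltaF * (delta - delta0)) / (delta * (deltaF - delta0))
        return (1.0 - d) * K * delta, d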
Abstract:
When developing software for autonomous mobile robots, one inevitably has to tackle some kind of perception. Moreover, when dealing with agents that possess some level of reasoning for executing their actions, there is the need to model the environment and the robot's internal state in a way that represents the scenario in which the robot operates. Inserted in the ATRI group, part of the IEETA research unit at Aveiro University, this work uses two of the group's projects as test beds, particularly in the scenario of robotic soccer with real robots. With the main objective of developing algorithms for sensor and information fusion that could be used effectively on these teams, several state-of-the-art approaches were studied, implemented and adapted to each of the robot types. Within the MSL RoboCup team CAMBADA, the main focus was the perception of the ball and obstacles, with the creation of models capable of providing extended information so that the reasoning of the robot can be ever more effective. To achieve this, several methodologies were analysed, implemented, compared and improved. Concerning the ball, an analysis of filtering methodologies for stabilization of its position and estimation of its velocity was performed. Also, with the goalkeeper in mind, work was done to provide it with information about aerial balls. As for obstacles, a new definition of the way they are perceived by the vision system and of the type of information provided was created, as well as a methodology for identifying which of the obstacles are teammates. A tracking algorithm was also developed, which ultimately assigns each of the obstacles a unique identifier. Associated with the improvement of obstacle perception, a new algorithm for reactive obstacle avoidance was created. In the context of the SPL RoboCup team Portuguese Team, besides the inevitable adaptation of many of the algorithms already developed for sensor and information fusion, and considering that the team was recently created, the objective was to create a sustainable software architecture that could be the base for future modular development. The software architecture created is based on a series of different processes and the means of communication among them. All processes were created or adapted for the new architecture, and a base set of roles and behaviours was defined during this work to achieve a basic functional framework. In terms of perception, the main focus was to define a projection model and camera pose extraction that could provide information in metric coordinates. The second main objective was to adapt the CAMBADA localization algorithm to work on the NAO robots, considering all the limitations they present when compared to the MSL robots, especially in terms of computational resources. A set of support tools was developed or improved in order to support testing and development in both teams. In general, the work developed during this thesis improved the performance of the teams during play and also the effectiveness of the development team during development and test phases.
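As an illustration of the kind of ball filtering and velocity estimation discussed, the sketch below shows a plain constant-velocity Kalman filter over 2D position measurements; it is not CAMBADA's actual implementation, and the noise parameters are placeholders.

    import numpy as np

    class BallFilter:
        """Constant-velocity Kalman filter for 2D ball tracking (illustrative only)."""

        def __init__(self, dt=0.033, q=0.5, r=0.05):
            self.x = np.zeros(4)                    # state: [px, py, vx, vy]
            self.P = np.eye(4)                      # state covariance
            self.F = np.eye(4)                      # constant-velocity transition
            self.F[0, 2] = self.F[1, 3] = dt
            self.H = np.zeros((2, 4))               # only the position is measured
            self.H[0, 0] = self.H[1, 1] = 1.0
            self.Q = q * np.eye(4)                  # process noise (tuning value)
            self.R = r * np.eye(2)                  # measurement noise (tuning value)

        def step(self, z):
            # predict
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            # update with the measured ball position z = [px, py]
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
            self.P = (np.eye(4) - K @ self.H) @ self.P
            return self.x[:2], self.x[2:]           # filtered position, estimated velocity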
Abstract:
The generalization of treatment gains in the field of sexual offending can be divided into two components: generalization that occurs during treatment and generalization following return to the community. The cognitive-behavioural treatment model, based on the risk, need and responsivity principles, yields a significant reduction in recidivism rates. More specifically, the criminogenic needs targeted in each offender and the type of strategies learned in treatment can influence the process of generalizing treatment gains. Likewise, the characteristics of the sexual offender also play a role. At release, the consideration and implementation of certain measures, such as a community reintegration plan, social and individual needs, employability, housing and therapeutic continuity, matter in order to facilitate the maintenance of gains. This master's project therefore aims to advance a better understanding of the generalization of treatment gains in four sexual offenders followed in the community (Centre de psychiatrie légale de Montréal) after completing a one-year treatment at the Institut Philippe-Pinel de Montréal. In order to understand the factors that may favour this process, we studied how these various factors presented themselves in the sexual offenders under study and the impact associated with the presence or absence of these variables. Clinical analysis of the material obtained showed, on the one hand, that generalization of treatment gains is facilitated when all criminogenic needs are targets of treatment and, on the other hand, when the offender is able to apply cognitive-behavioural strategies rather than purely cognitive techniques. Furthermore, the presence of impulsivity and of unstabilized individual problems can hinder the process. Finally, it emerged that generalization of treatment gains is more easily achieved when the variables identified as conducive to successful community reintegration are present in the offenders' daily lives.
Abstract:
The distribution of audiovisual works on new-media platforms, such as broadcasters' or web broadcasters' websites, video on demand, mobile television or web distribution, modifies the risks that audiovisual producers normally have to manage on traditional platforms such as television. This shift in risks stems from four sources in particular: the market, business practices, laws and regulations, and the technologies themselves. These sources can also give rise to norms that may constitute a legal framework for modulating or eliminating the risks. This thesis analyses the risks incurred in distributing audiovisual works on new-media platforms from the producers' point of view, through the risk management process. It thus identifies and catalogues the changing risks and the new risks that producers face. The identified risks are then defined, and the legal framework is addressed in the context of implementing a risk management strategy and of measures to mitigate or avoid the risks arising from producers' production and exploitation activities.
Abstract:
Dependence between financial series is a fundamental parameter for the estimation of risk models. Value at Risk (VaR) is one of the most important measures used for financial risk management. Different methods currently exist for its estimation, such as the historical simulation method, which assumes no distribution for the returns of the risk factors or assets, or parametric methods, which assume normally distributed returns. This paper introduces copula theory as a measure of dependence between series and estimates an ARMA-GARCH-copula model to compute the Value at Risk of a portfolio composed of two financial series, the Dollar-Peso and Euro-Peso exchange rates. The results obtained show that VaR estimation via copulas is more accurate than traditional methods.
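For reference, the historical simulation baseline mentioned can be sketched in a few lines for a two-series portfolio; the weights and confidence level are illustrative, and the ARMA-GARCH-copula estimation itself is not reproduced here.

    import numpy as np

    def historical_var(returns_usd, returns_eur, weights=(0.5, 0.5), alpha=0.99):
        """One-day Value at Risk by historical simulation for a two-asset portfolio.

        returns_usd, returns_eur : historical return series (e.g. Dollar-Peso and
        Euro-Peso exchange-rate returns); weights and alpha are illustrative.
        """
        r = np.column_stack([returns_usd, returns_eur])
        port = r @ np.asarray(weights)              # historical portfolio returns
        # VaR is the loss at the (1 - alpha) quantile of the return distribution
        return -np.quantile(port, 1.0 - alpha)

In the copula approach, the empirical joint scenarios above would be replaced by scenarios simulated from the fitted marginal (ARMA-GARCH) models joined through the estimated copula, before reading off the same portfolio loss quantile.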
Abstract:
Few studies exist on the types of characteristics associated with service utilization (e.g., shelters, food programs) among homeless youth in the U.S. Services are important, however, because without food and shelter, numerous homeless youth resort to trading sex in order to meet their daily survival needs. Access to physical and mental health services gives homeless youth more of an opportunity to integrate into mainstream society than they would otherwise have. To address this gap in our understanding, my study examines what traits (e.g. age, race, abuse history) correlate with the use of shelters, food programs, street outreach, counseling, STD/STI testing, and HIV testing among homeless youth. The Theory of Reasoned Action is used as an ideological framework in conjunction with theoretical constructs of risk, need, and prior service exposure. Data were obtained from the Social Network and Homeless Youth Project (SNHYP), a sample of 249 Midwestern homeless youth ages 14 to 21, which used trained interviewers to conduct structured interviews with youth. Respondents were interviewed in both shelters and on the street over a period of approximately one year. My findings revealed that homeless youth’s service usage varied across gender, sexual orientation, age, having recently held a job, and having ever been physically or sexually abused, in addition to other characteristics. Conversely, service use was not associated with social network size or subjective norms (i.e. attitudes of peers, such as acceptance of condom use) of youths’ social networks. By examining these areas, my study builds on previous research on homeless youth and lays the framework for future research on service utilization by homeless youth.
Abstract:
In this work we aim to propose a new approach to preliminary epidemiological studies of Standardized Mortality Ratios (SMR) collected over many spatial regions. A preliminary study of SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from collecting disease counts and calculating expected disease counts by means of reference-population disease rates, in each area an SMR is derived as the MLE under the Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because of the small population underlying the area or the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple testing control, without however leaving the preliminary-study perspective that an analysis of SMR indicators calls for. We implement control of the FDR, a quantity widely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value have weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value based methods. Another peculiarity of the present work is to propose a hierarchical fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all observations in the map under study, rather than by the single observation alone. This improves test power in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities of the null hypothesis (absence of risk) in each area. An estimate of the expected FDR conditional on the data can be calculated over any set of areas declared at high risk (where the null hypothesis is rejected) by averaging the corresponding posterior null probabilities. This estimated FDR can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that the estimated FDR does not exceed a prefixed value; we call these estimated-FDR based decision (or selection) rules.
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model has the interesting feature of still providing an estimate of relative risk values, as in the Besag, York and Mollié model (1991). A simulation study was set up to evaluate the model's performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of estimation of the relative risks. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the area sizes, the number of areas where the null hypothesis is true, and the risk level in the remaining areas. In summarizing the simulation results we always consider FDR estimation in the sets formed by all areas whose posterior null probability falls below a threshold t. We show graphs of the estimated FDR and the true FDR (known by simulation) plotted against the threshold t to assess FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (through the closeness between estimated and true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against the estimated FDR, we can check the sensitivity and specificity of the corresponding estimated-FDR based decision rules. To investigate the over-smoothing of the relative risk estimates we compare box-plots of such estimates in high-risk areas (known by simulation), obtained from both our model and the classic Besag, York and Mollié model. All of these summary tools are worked out for all simulated scenarios (54 scenarios in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence conservative FDR control) in scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aims. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of estimated-FDR based decision rules is generally low, but specificity is high, so selection rules based on an estimated FDR of 0.05 or 0.10 can be suggested. In cases where the number of true alternative hypotheses (true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and decision rules based on an estimated FDR of 0.15 gain power while maintaining high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05), resulting in a loss of specificity of a decision rule based on an estimated FDR of 0.05. In such scenarios, decision rules based on an estimated FDR of 0.05 or, even worse, 0.10 cannot be recommended, because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
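Two ingredients of the procedure are easy to sketch: the per-area SMR as the Poisson MLE, and the estimated-FDR selection rule obtained by averaging posterior null probabilities over the selected areas. The sketch below assumes the posterior probabilities of absence of risk are already available from the MCMC fit of the hierarchical model; function and variable names are illustrative.

    import numpy as np

    def smr(observed, expected):
        """Per-area SMR: the Poisson MLE of relative risk, observed / expected counts."""
        return np.asarray(observed, float) / np.asarray(expected, float)

    def fdr_selection(post_null_prob, target_fdr=0.05):
        """Select high-risk areas so that the estimated FDR stays within target_fdr.

        post_null_prob : MCMC posterior probabilities of absence of risk per area.
        Returns the indices of the selected areas and the estimated FDR of the set.
        """
        p = np.asarray(post_null_prob, float)
        order = np.argsort(p)                                  # most convincing areas first
        running_fdr = np.cumsum(p[order]) / np.arange(1, len(p) + 1)
        k = int(np.sum(running_fdr <= target_fdr))             # largest prefix within the target
        selected = order[:k]
        est_fdr = float(running_fdr[k - 1]) if k > 0 else 0.0  # average posterior null probability
        return selected, est_fdr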
Abstract:
BACKGROUND Renal cell carcinoma (RCC) is marked by a high mortality rate. To date, no robust risk stratification by clinical or molecular prognosticators of cancer-specific survival (CSS) has been established for early stages. Transcriptional profiling of small non-coding RNA gene products (miRNAs) seems promising for prognostic stratification. The expression of miR-21 and miR-126 was analysed in a large cohort of RCC patients, and a combined risk score (CRS) model was constructed based on the expression levels of both miRNAs. METHODS Expression of miR-21 and miR-126 was evaluated by qRT-PCR in tumour and adjacent non-neoplastic tissue in n = 139 clear cell RCC patients. The relation of miR-21 and miR-126 expression with various clinical parameters was assessed. Parameters were analysed by uni- and multivariate Cox regression. A factor derived from the z-score resulting from the Cox model was determined for each miRNA separately, and a combined risk score (CRS) was calculated by multiplying the relative expression of miR-21 and miR-126 by this factor. The best-fitting Cox model was selected by relative goodness-of-fit with the Akaike information criterion (AIC). RESULTS RCC with and without miR-21 upregulation and miR-126 downregulation differed significantly in synchronous metastatic status and CSS. Upregulation of miR-21 and downregulation of miR-126 were independently prognostic. A combined risk score (CRS) based on the expression of both miRNAs showed high sensitivity and specificity in predicting CSS, and the prediction was independent of any other clinico-pathological parameter. The association of CRS with CSS was successfully validated in a testing cohort containing patients at high and low risk of progressive disease. CONCLUSIONS The combined expression level of miR-21 and miR-126 accurately predicted CSS in two independent RCC cohorts and seems feasible for clinical application in assessing prognosis.
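A rough sketch of how such a combined risk score could be assembled is shown below; the coefficients are placeholders standing in for the factors derived from the fitted Cox model, and the dichotomization cutoff is illustrative, so this is not the authors' exact construction.

    import numpy as np

    def combined_risk_score(expr_mir21, expr_mir126, beta_mir21=0.8, beta_mir126=-0.6):
        """Combined risk score from relative miR-21 and miR-126 expression per patient.

        Each expression level is weighted by a factor derived from the fitted Cox
        model (placeholder values here: positive for the risk-increasing miR-21,
        negative for the protective miR-126) and the contributions are summed.
        """
        return beta_mir21 * np.asarray(expr_mir21) + beta_mir126 * np.asarray(expr_mir126)

    def dichotomize(scores, cutoff=None):
        """Split patients into high/low risk groups at a cutoff (median by default)."""
        s = np.asarray(scores)
        cutoff = np.median(s) if cutoff is None else cutoff
        return s >= cutoff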
Abstract:
Motivated by the need to precisely simulate, with an explicit non-linear finite element code, the impact phenomena that may occur inside a jet engine turbine, four new material models are postulated. Each one is calibrated for one of four high-performance alloys that can be encountered in a modern jet engine. A new uncoupled material model for high strain rate and ballistic applications is proposed. Based on a Johnson-Cook type model, the proposed formulation introduces the effect of the third deviatoric invariant by means of three different Lode angle dependent functions. The Lode dependent functions are added to both the plasticity and the failure models. The postulated model is calibrated for a 6061-T651 aluminium alloy with data taken from the literature. The fracture pattern predictability of the JCX material model is demonstrated by performing numerical simulations of various quasi-static and dynamic tests. As an extension of the above-mentioned model, a modification of the thermal softening behaviour due to phase transformation temperatures is developed (JCXt). Additionally, a Lode angle dependent flow stress is defined. By analysing the phase diagram and the high-temperature tests performed, the phase transformation temperatures of the FV535 stainless steel are determined, and the constants of the postulated material model are calibrated for this steel. A coupled elastoplastic-damage material model for high strain rate and ballistic applications is also presented (JCXd). A Lode angle dependent function is added to the equivalent-plastic-strain-to-failure definition of the Johnson-Cook failure criterion. The weakening in the elastic law and in the Johnson-Cook type constitutive relation implicitly introduces the Lode angle dependency in the elastoplastic behaviour. The material model is calibrated for the precipitation-hardened Inconel 718 nickel-base superalloy. The combination of a Lode angle dependent failure criterion with weakened constitutive equations is shown to predict the fracture patterns of the mechanical tests performed and to provide reliable results. Finally, a transversely isotropic material model for directionally solidified alloys is presented. The proposed yield function is based on a single linear transformation of the stress tensor; the linear operator weighs the degree of anisotropy of the yield function. The elastic behaviour, as well as the hardening, is considered isotropic, and a Johnson-Cook type relation is adopted to model the hardening. A material vector is included in the model implementation, and failure is modelled with the Cockcroft-Latham failure criterion. The material vector allows the reference orientation to be aligned with any direction the user may need. The model is calibrated for the MAR-M 247 directionally solidified nickel-base superalloy.
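The baseline on which the proposed JCX formulation builds is the Johnson-Cook flow stress. The sketch below evaluates that baseline with an optional multiplicative Lode-angle factor standing in for the three Lode-dependent functions introduced in the thesis; all constants are illustrative and are not the calibrated values for any of the alloys mentioned.

    import numpy as np

    def johnson_cook_flow_stress(eps_p, eps_rate, T,
                                 A=324.0, B=114.0, n=0.42, C=0.002, m=1.34,
                                 eps_rate_ref=1.0, T_ref=293.0, T_melt=925.0,
                                 lode_factor=1.0):
        """Johnson-Cook flow stress with an optional Lode-angle multiplier.

        eps_p    : equivalent plastic strain
        eps_rate : equivalent plastic strain rate
        T        : temperature (K)
        The constants are illustrative placeholders; lode_factor stands in for the
        Lode-angle dependent functions the JCX model adds to the baseline.
        """
        strain_term = A + B * eps_p**n                              # strain hardening
        rate_term = 1.0 + C * np.log(max(eps_rate / eps_rate_ref, 1e-12))  # rate sensitivity
        T_hom = np.clip((T - T_ref) / (T_melt - T_ref), 0.0, 1.0)   # homologous temperature
        thermal_term = 1.0 - T_hom**m                               # thermal softening
        return lode_factor * strain_term * rate_term * thermal_term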