884 results for Problem analysis
Abstract:
OBJECTIVE: Caring for a loved one with Alzheimer disease is a highly stressful experience that is associated with significant depressive symptoms. Previous studies indicate a positive association between problem behaviors in patients with Alzheimer disease (e.g., repeating questions, restlessness, and agitation) and depressive symptoms in their caregivers. Moreover, the extant literature indicates a robust negative relationship between escape-avoidance coping (i.e., avoiding people, wishing the situation would go away) and psychiatric well-being. The purpose of this study was to test a mediational model of the associations between patient problem behaviors, escape-avoidance coping, and depressive symptoms in Alzheimer caregivers. METHODS: Ninety-five spousal caregivers (mean age: 72 years) completed measures assessing their loved ones' frequency of problem behaviors, escape-avoidance coping, and depressive symptoms. A mediational model was tested to determine whether escape-avoidant coping partially mediated the relationship between patient problem behaviors and caregiver depressive symptoms. RESULTS: Patient problem behaviors were positively associated with escape-avoidance coping (beta = 0.38, p < 0.01) and depressive symptoms (beta = 0.26, p < 0.05). Escape-avoidance coping was positively associated with depressive symptoms (beta = 0.33, p < 0.01). In a final regression analysis, the effect of problem behaviors on depressive symptoms was attenuated after controlling for escape-avoidance coping. Sobel's test confirmed that escape-avoidance coping significantly mediated the relationship between problem behaviors and depressive symptoms (z = 2.07, p < 0.05). CONCLUSION: Escape-avoidance coping partially mediates the association between patient problem behaviors and depressive symptoms among elderly caregivers of spouses with dementia. This finding provides a specific target for psychosocial interventions for caregivers.
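For reference, the Sobel test reported above is conventionally computed from the two regression paths; this is the standard formulation (not reproduced from the paper), where a is the path from problem behaviors to escape-avoidance coping and b the path from coping to depressive symptoms controlling for behaviors:

```latex
% Sobel z for the indirect effect a*b, with SE_a and SE_b the
% standard errors of the two path coefficients:
\[
  z \;=\; \frac{a\,b}{\sqrt{\,b^{2}\,\mathrm{SE}_a^{2} \;+\; a^{2}\,\mathrm{SE}_b^{2}\,}}
\]
```

A |z| above 1.96, as with the reported z = 2.07, indicates a significant indirect effect at p < 0.05.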
Abstract:
Studies of chronic life-threatening diseases often involve both mortality and morbidity. In observational studies, the data may also be subject to administrative left truncation and right censoring. Since mortality and morbidity may be correlated and mortality may censor morbidity, the Lynden-Bell estimator for left truncated and right censored data may be biased for estimating the marginal survival function of the non-terminal event. We propose a semiparametric estimator for this survival function based on a joint model for the two time-to-event variables, which utilizes the gamma frailty specification in the region of the observable data. Firstly, we develop a novel estimator for the gamma frailty parameter under left truncation. Using this estimator, we then derive a closed form estimator for the marginal distribution of the non-terminal event. The large sample properties of the estimators are established via asymptotic theory. The methodology performs well with moderate sample sizes, both in simulations and in an analysis of data from a diabetes registry.
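As background for the approach sketched above: a shared gamma frailty with parameter θ induces the Clayton-copula form for the joint survivor function of the two event times, a standard identity (the paper's restriction of this specification to the region of the observable data is not reproduced here):

```latex
% Joint survivor function of the non-terminal (T1) and terminal (T2)
% event times under a gamma frailty with cross-ratio parameter \theta > 1:
\[
  S(t_1, t_2) \;=\;
  \Big[ S_1(t_1)^{\,1-\theta} + S_2(t_2)^{\,1-\theta} - 1 \Big]^{\frac{1}{1-\theta}}
\]
```

Once θ has been estimated under left truncation, the marginal survivor function S1 of the non-terminal event can be recovered in closed form from estimable quantities, which is the route the abstract describes.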
Abstract:
We are concerned with the estimation of the exterior surface of tube-shaped anatomical structures. This interest is motivated by two distinct scientific goals, one dealing with the distribution of HIV microbicide in the colon and the other with measuring degradation in white-matter tracts in the brain. Our problem is posed as the estimation of the support of a distribution in three dimensions from a sample from that distribution, possibly measured with error. We propose a novel tube-fitting algorithm to construct such estimators. Further, we conduct a simulation study to aid in the choice of a key parameter of the algorithm, and we test our algorithm with a validation study tailored to the motivating data sets. Finally, we apply the tube-fitting algorithm to a colon image produced by single photon emission computed tomography (SPECT) and to a white-matter tract image produced using diffusion tensor imaging (DTI).
Abstract:
Recent developments in clinical radiology have resulted in additional developments in the field of forensic radiology. After the implementation of cross-sectional radiology and optical surface documentation in forensic medicine, difficulties were experienced in the validation and analysis of the acquired data. To address this problem, and to allow comparison of autopsy and radiological data, a centralized, internet-based database for forensic cases was created. The main goals of the database are (1) creation of a digital and standardized documentation tool for forensic-radiological and pathological findings; (2) establishing a basis for validating forensic cross-sectional radiology as a non-invasive examination method in forensic medicine, that is, comparing and evaluating the radiological and autopsy data and analyzing the accuracy of such data; and (3) providing a conduit for continuing research and education in forensic medicine. Considering the infrequent availability of CT or MRI to forensic institutions and the heterogeneous nature of case material in forensic medicine, an evaluation of the benefits and limitations of cross-sectional imaging for specific forensic questions by a single institution may be of limited value. A centralized database permitting international forensic and cross-disciplinary collaborations may provide important support for forensic-radiological casework and research.
Abstract:
An important problem in computational biology is finding the longest common subsequence (LCS) of two nucleotide sequences. This paper examines the correctness and performance of a recently proposed parallel LCS algorithm that uses successor tables and pruning rules to construct a list of sets from which an LCS can be easily reconstructed. Counterexamples are presented for two of the pruning rules supplied with the original algorithm. Because of these errors, the performance measurements originally reported cannot be validated. The work presented here shows that speedup can be reliably achieved by an implementation in Unified Parallel C that runs on an InfiniBand cluster. This performance is partly facilitated by exploiting the software cache of the MuPC runtime system. In addition, this implementation achieved speedup without bulk memory copy operations and the associated programming complexity of message passing.
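The paper's successor-table algorithm and pruning rules are not reproduced here; as a point of reference, the sketch below is the textbook sequential dynamic program that any LCS implementation, parallel or not, can be validated against:

```python
def lcs(a: str, b: str) -> str:
    """Textbook O(|a|*|b|) dynamic program for the longest common
    subsequence; a sequential baseline, not the parallel
    successor-table algorithm examined in the paper."""
    m, n = len(a), len(b)
    # dp[i][j] = length of an LCS of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # Trace back through the table to reconstruct one LCS.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1])
            i, j = i - 1, j - 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs("GATTACA", "TACGTACG"))  # prints one longest common subsequence
```

Checking a parallel implementation's LCS length against this baseline is one way to expose incorrect pruning rules of the kind the counterexamples address.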
Abstract:
The goal of this research is to provide a framework for the vibro-acoustic analysis and design of a multiple-layer constrained damping structure. Existing research on damping and the viscoelastic damping mechanism is limited to four mainstream approaches: modeling techniques for damping treatments/materials; control through the electro-mechanical effect using a piezoelectric layer; optimization by adjusting the parameters of the structure to meet design requirements; and identification of a damping material's properties through the response of the structure. This research proposes a systematic design methodology for the multiple-layer constrained damping beam that takes vibro-acoustics into consideration. A modeling technique for studying the vibro-acoustics of multiple-layered viscoelastic laminated beams using the Biot damping model is presented using a hybrid numerical model: the boundary element method (BEM) is used to model the acoustical cavity, whereas the finite element method (FEM) is the basis for the vibration analysis of the multiple-layered beam structure. Through the proposed procedure, the analysis can easily be extended to other complex geometries with arbitrary boundary conditions. The nonlinear behavior of viscoelastic damping materials is represented by the Biot damping model, which takes into account the effects of frequency, temperature, and different damping materials for individual layers. A curve-fitting procedure used to obtain the Biot constants for different damping materials at each temperature is explained. The results of the structural vibration analysis for selected beams agree with published closed-form results, and the radiated noise for a sample beam structure obtained using commercial BEM software is compared with the acoustical results for the same beam using the Biot damping model. The extension of the Biot damping model to the MDOF (multiple-degrees-of-freedom) dynamics equations of a discrete system is demonstrated in order to introduce different types of viscoelastic damping materials. The mechanical properties of viscoelastic damping materials, such as shear modulus and loss factor, change with ambient temperature and frequency. The application of a multiple-layer treatment increases the damping of the structure significantly and thus helps attenuate vibration and noise over a broad range of frequencies and temperatures. The main contributions of this dissertation comprise three major tasks: 1) study of the viscoelastic damping mechanism and the dynamics equation of a multilayer damped system incorporating the Biot damping model; 2) building the finite element (FEM) model of the multiple-layer constrained viscoelastic damping beam and conducting the vibration analysis; and 3) extending the vibration problem to the boundary element method (BEM) based acoustical problem and comparing the results with commercial simulation software.
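For orientation, the Biot damping model referred to above is commonly written as a frequency-dependent modulus built from a series of relaxation (mini-oscillator) terms; the following is a standard statement of the model, with the caveat that the dissertation's exact notation may differ:

```latex
% Biot representation of the viscoelastic shear modulus in the
% Laplace/frequency domain; G^inf is the equilibrium modulus and
% (a_k, b_k) are the Biot constants obtained by curve fitting:
\[
  G(s) \;=\; G^{\infty}\Big( 1 + \sum_{k=1}^{n} a_k \,\frac{s}{s + b_k} \Big),
  \qquad
  \big( s^{2} M + G(s)\, K \big)\, \mathbf{X}(s) \;=\; \mathbf{F}(s)
\]
```

The second relation indicates how the fitted modulus enters the MDOF equations of motion of a discrete system; introducing an internal dissipation coordinate for each Biot term recovers a linear system that standard eigensolvers can handle.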
Abstract:
An extrusion die is used to continuously produce parts with a constant cross section, such as sheets, pipes, tire components, and more complex shapes such as window seals. The die is fed by a screw extruder when polymers are used. The extruder melts, mixes, and pressurizes the material by the rotation of either a single or a double screw. The polymer can then be continuously forced through the die, producing a long part in the shape of the die outlet. The extruded section is then cut to the desired length. Generally, the primary target of a well-designed die is to produce a uniform outlet velocity without excessively raising the pressure required to extrude the polymer through the die. Other properties, such as temperature uniformity and residence time, are also important but are not directly considered in this work. Designing dies for optimal outlet-velocity variation using simple analytical equations is feasible for basic die geometries or simple channels. Due to the complexity of die geometry and of polymer material properties, the design of complex dies by analytical methods is difficult; for complex dies, iterative methods must be used, and an automated iterative method is desired for die optimization. To automate the design and optimization of an extrusion die, two issues must be dealt with. The first is how to generate a new mesh for each iteration. In this work, this is approached by modifying a Parasolid file that describes a CAD part; this file is then used in commercial meshing software. Skewing the initial mesh to produce a new geometry was also employed as a second option. The second issue is an optimization problem in the presence of noise stemming from variations in the mesh and cumulative truncation errors. In this work, a simplex method and a modified trust-region method were employed for the automated optimization of die geometries. For the trust-region method, a discrete derivative and a BFGS Hessian approximation were used. To deal with the noise in the function, the trust-region method was modified to automatically adjust the discrete-derivative step size and the trust region based on changes in noise and function contour. Generally, the uniformity of the velocity at the exit of the extrusion die can be improved by increasing the resistance across the die, but this is limited by the pressure capabilities of the extruder. In the optimization, a penalty factor that increases exponentially beyond the pressure limit is applied. This penalty can be applied in two different ways: the first only to designs that exceed the pressure limit, the second to designs both above and below the pressure limit. Both of these methods were tested and compared in this work.
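A minimal sketch of the penalized objective described above, with both penalty variants; all names and the exact functional forms are illustrative assumptions, not the dissertation's formulation:

```python
import numpy as np

def die_objective(outlet_velocity: np.ndarray, pressure: float,
                  p_limit: float, k: float = 10.0, weight: float = 1.0,
                  one_sided: bool = True) -> float:
    """Objective = outlet-velocity nonuniformity + exponential pressure
    penalty. one_sided=True penalizes only designs over the pressure
    limit; one_sided=False evaluates the same exponential for every
    design, so the penalty is tiny below the limit and grows rapidly
    beyond it."""
    # Nonuniformity measured as the coefficient of variation at the die exit.
    nonuniformity = outlet_velocity.std() / outlet_velocity.mean()
    ratio = pressure / p_limit
    if one_sided:
        penalty = np.expm1(k * (ratio - 1.0)) if ratio > 1.0 else 0.0
    else:
        penalty = np.exp(k * (ratio - 1.0))
    return nonuniformity + weight * penalty
```

A simplex or trust-region optimizer then minimizes this scalar over the die's geometric parameters, re-meshing and re-solving the flow at each iterate.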
Abstract:
Fuel cells are a promising alternative energy technology. One of the biggest problems in fuel cells is water management. A better understanding of the wettability characteristics of fuel cells is needed to alleviate the problem of water management. Contact angle data on the gas diffusion layers (GDL) of fuel cells can be used to characterize the wettability of the GDL. A contact angle measurement program has been developed to measure the contact angle of sessile drops from drop images. Digitization of drop images induces pixel errors in the contact angle measurement process, and the resulting uncertainty in contact angle measurement has been analyzed. An experimental apparatus has been developed for contact angle measurements at different temperatures, with the ability to measure advancing and receding contact angles on the gas diffusion layers of fuel cells.
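As an illustration of how digitization error propagates into a contact angle, the sketch below assumes a spherical-cap drop profile (a simplification; the program described above works from the full digitized drop image) and propagates a half-pixel measurement error by finite differences:

```python
import math

def contact_angle_deg(base_width_px: float, height_px: float) -> float:
    """Sessile-drop contact angle from silhouette width and height,
    assuming a spherical cap: theta = 2*atan(2h/w)."""
    return math.degrees(2.0 * math.atan2(2.0 * height_px, base_width_px))

def pixel_uncertainty_deg(base_width_px: float, height_px: float,
                          err_px: float = 0.5) -> float:
    """Propagate a +/- err_px digitization error in measured width and
    height into the angle via central finite differences."""
    dth_dw = (contact_angle_deg(base_width_px + err_px, height_px)
              - contact_angle_deg(base_width_px - err_px, height_px)) / 2.0
    dth_dh = (contact_angle_deg(base_width_px, height_px + err_px)
              - contact_angle_deg(base_width_px, height_px - err_px)) / 2.0
    return math.hypot(dth_dw, dth_dh)

# A 200 px wide, 80 px tall drop: angle ~77 deg, pixel-level
# uncertainty well under a degree.
print(contact_angle_deg(200, 80), pixel_uncertainty_deg(200, 80))
```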
Abstract:
The developmental processes and functions of an organism are controlled by genes and the proteins derived from these genes. The identification of key genes and the reconstruction of gene networks can provide a model to help us understand the regulatory mechanisms behind the initiation and progression of biological processes or functional abnormalities (e.g. diseases) in living organisms. In this dissertation, I have developed statistical methods to identify the genes and transcription factors (TFs) involved in biological processes, constructed their regulatory networks, and also evaluated some existing association methods to find robust methods for coexpression analyses. Two kinds of data sets were used for this work: genotype data and gene expression microarray data. On the basis of these data sets, this dissertation has two major parts, together forming six chapters. The first part deals with developing association methods for rare variants using genotype data (chapters 4 and 5). The second part deals with developing and/or evaluating statistical methods to identify genes and TFs involved in biological processes, and with constructing their regulatory networks, using gene expression data (chapters 2, 3, and 6). For the first part, I have developed two methods to find the groupwise association of rare variants with given diseases or traits. The first method is based on kernel machine learning and can be applied to both quantitative and qualitative traits. Simulation results showed that the proposed method has improved power over the existing weighted-sum (WS) method in most settings. The second method uses multiple phenotypes to select a few top significant genes. It then finds the association of each gene with each phenotype while controlling for population stratification by adjusting the data for ancestry using principal components. This method was applied to GAW 17 data and was able to find several disease-risk genes. For the second part, I have worked on three problems. The first problem involved the evaluation of eight gene association methods; a very comprehensive comparison of these methods with further analysis clearly demonstrates their distinct and common performance. For the second problem, an algorithm named the bottom-up graphical Gaussian model was developed to identify the TFs that regulate pathway genes and to reconstruct their hierarchical regulatory networks. This algorithm has produced very significant results, and this is the first report to produce such hierarchical networks for these pathways. The third problem dealt with developing another algorithm, called the top-down graphical Gaussian model, which identifies the network governed by a specific TF. The networks produced by the algorithm are shown to be of very high accuracy.
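A minimal sketch of a variance-component (kernel machine) score statistic of the kind the first rare-variant method builds on, in the style of SKAT with a weighted linear kernel; this is an illustration under an intercept-only null model, not the dissertation's exact method:

```python
import numpy as np

def kernel_score_statistic(G, y, weights=None):
    """Q = r' K r with r the null-model residuals and K = G W^2 G' a
    weighted linear kernel over rare-variant genotypes.
    G: (n samples x m variants) genotype matrix coded 0/1/2.
    y: length-n quantitative trait. Significance would be assessed from
    the null mixture-of-chi-squares distribution (omitted here)."""
    G = np.asarray(G, dtype=float)
    y = np.asarray(y, dtype=float)
    if weights is None:
        # Upweight rarer variants (an illustrative weighting choice).
        maf = G.mean(axis=0) / 2.0
        weights = 1.0 / np.sqrt(np.clip(maf * (1.0 - maf), 1e-8, None))
    r = y - y.mean()            # residuals under the intercept-only null
    s = (G * weights).T @ r     # weighted per-variant score vector
    return float(s @ s)         # equals r' G W^2 G' r
```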
Abstract:
Asthma is an increasing health problem worldwide, but the long-term temporal pattern of clinical symptoms is not understood and predicting asthma episodes is not generally possible. We analyse the time series of peak expiratory flows, a standard measurement of airway function that has been assessed twice daily in a large asthmatic population during a long-term crossover clinical trial. Here we introduce an approach to predict the risk of worsening airflow obstruction by calculating the conditional probability that, given the current airway condition, a severe obstruction will occur within 30 days. We find that, compared with a placebo, a regular long-acting bronchodilator (salmeterol) that is widely used to improve asthma control decreases the risk of airway obstruction. Unexpectedly, however, a regular short-acting beta2-agonist bronchodilator (albuterol) increases this risk. Furthermore, we find that the time series of peak expiratory flows show long-range correlations that change significantly with disease severity, approaching a random process with increased variability in the most severe cases. Using a nonlinear stochastic model, we show that both the increased variability and the loss of correlations augment the risk of unstable airway function. The characterization of fluctuations in airway function provides a quantitative basis for objective risk prediction of asthma episodes and for evaluating the effectiveness of therapy.
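A minimal sketch of the conditional-risk idea described above: estimate, from a twice-daily peak-expiratory-flow series, the probability that a severe obstruction occurs within 30 days given the current airway condition. The thresholds and window below are illustrative assumptions, not the paper's values:

```python
import numpy as np

def conditional_risk(pef, horizon=60, severe_frac=0.6, current_frac=0.8):
    """P(severe obstruction within `horizon` readings | current PEF low).
    Twice-daily readings make horizon=60 roughly a 30-day window;
    'severe' and 'low' are fractions of the personal-best PEF."""
    pef = np.asarray(pef, dtype=float)
    best = np.nanmax(pef)
    hits = total = 0
    for t in range(len(pef) - horizon):
        if pef[t] < current_frac * best:                 # conditioning event
            total += 1
            window = pef[t + 1:t + 1 + horizon]
            if np.nanmin(window) < severe_frac * best:   # severe episode follows
                hits += 1
    return hits / total if total else float("nan")
```

Comparing this conditional probability across treatment arms is one way to quantify, as in the study, whether a given bronchodilator raises or lowers the risk of worsening obstruction.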
Resource-allocation capabilities of commercial project management software. An experimental analysis
Abstract:
When project managers determine schedules for resource-constrained projects, they commonly use commercial project management software packages. Which resource-allocation methods are implemented in these packages is proprietary information. The resource-allocation problem is in general computationally difficult to solve to optimality. Hence, the question arises whether and how various project management software packages differ in quality with respect to their resource-allocation capabilities. None of the few existing papers on this subject uses a sizeable data set and recent versions of common software packages. We experimentally analyze the resource-allocation capabilities of Acos Plus.1, AdeptTracker Professional, CS Project Professional, Microsoft Office Project 2007, Primavera P6, Sciforma PS8, and Turbo Project Professional. Our analysis is based on 1560 instances of the precedence- and resource-constrained project scheduling problem (RCPSP). The experiment shows that using the resource-allocation feature of these packages may lead to a project duration increase of almost 115% above the best known feasible schedule. The increase grows with increasing resource scarcity and with an increasing number of activities. We investigate the impact of different complexity scenarios and priority rules on the project duration obtained by the software packages, and we provide a decision table to support managers in selecting a software package and a priority rule.
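For concreteness, the sketch below shows a serial schedule-generation scheme with a pluggable priority rule, the textbook style of heuristic that such packages are generally believed to implement (their actual methods are proprietary, as noted above):

```python
def serial_sgs(durations, successors, demands, capacity, priority):
    """Serial schedule-generation scheme for the RCPSP: repeatedly start
    the best-priority eligible activity at the earliest precedence- and
    resource-feasible time. Returns a dict activity -> start period."""
    n = len(durations)
    preds = [[] for _ in range(n)]
    for j, succs in enumerate(successors):
        for s in succs:
            preds[s].append(j)
    horizon = sum(durations) + 1
    usage = [[0] * horizon for _ in capacity]   # usage[k][t] per resource k
    start = {}
    while len(start) < n:
        eligible = [j for j in range(n) if j not in start
                    and all(p in start for p in preds[j])]
        j = min(eligible, key=priority)         # e.g. priority = lambda j: durations[j]
        t = max((start[p] + durations[p] for p in preds[j]), default=0)
        # Shift right until every period of the activity fits all resources.
        while any(usage[k][u] + demands[j][k] > capacity[k]
                  for k in range(len(capacity))
                  for u in range(t, t + durations[j])):
            t += 1
        for k in range(len(capacity)):
            for u in range(t, t + durations[j]):
                usage[k][u] += demands[j][k]
        start[j] = t
    return start
```

Different priority rules (shortest duration, most successors, latest start time, and so on) plug in via the `priority` argument, which is how a rule-by-rule comparison like the one in the paper can be organized.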
Abstract:
Balancing the frequently conflicting priorities of conservation and economic development poses a challenge to the management of the Swiss Alps Jungfrau-Aletsch World Heritage Site (WHS). This is a complex societal problem that calls for a knowledge-based solution, which in turn requires a transdisciplinary research framework in which problems are defined and solved cooperatively by actors from the scientific community and the life-world. In this article we re-examine studies carried out in the region of the Swiss Alps Jungfrau-Aletsch WHS, covering three key issues prevalent in transdisciplinary settings: integration of stakeholders into participatory processes; perceptions and positions; and negotiability and implementation. In the case of the Swiss Alps Jungfrau-Aletsch WHS, the transdisciplinary setting created a situation of mutual learning among stakeholders from different levels and backgrounds. However, the studies showed that the benefits of such processes of mutual learning are continuously at risk of being diminished by the power play inherent in participatory approaches.
Abstract:
This article describes and analyses youth criminality in the city of Rosario, Argentina, between 2003 and 2006. Key actors' understandings of and responses to the conflict were investigated by means of semi-structured interviews, observations, discourse analysis of policy documents, and analysis of secondary data, and the study draws heavily on the experience of the author, a citizen of Rosario and a youth worker there. The actors examined were the police, the local government, young delinquents, and youth organisations. Youth criminality is analysed from a conflict transformation approach using conflict analysis tools. Whereas the provincial police understand the issue as a delinquency problem, other actors perceive it as an expression of a wider urban social conflict between those who are “included” and those who are “excluded”, and as one of the negative effects of globalisation processes. The results suggest that police responses addressing only direct violence are ineffective, even contributing to increased tension and polarisation, whereas strategies addressing cultural and structural violence are better suited to this type of urban social conflict. Finally, recommendations for local youth policy are proposed, both to facilitate the participation and inclusion of youth and as a tool for peaceful conflict transformation.
Abstract:
Increasing demand for marketing accountability requires an efficient allocation of marketing expenditures. Managers who know the elasticity of their marketing instruments can allocate their budgets optimally. Meta-analyses offer a basis for deriving benchmark elasticities for advertising. Although they provide a variety of valuable insights, a major shortcoming of prior meta-analyses is that they report only generalized results, as the disaggregated raw data are not made available. This problem is highly relevant because the coding of empirical studies, at least to a certain extent, involves subjective judgment. For this reason, meta-studies would be more valuable if researchers and practitioners had access to disaggregated data allowing them to conduct further analyses of individual interest, e.g., at the product level. We are the first to address this gap by providing (1) an advertising elasticity database (AED) and (2) empirical generalizations about advertising elasticities and their determinants. Our findings indicate that the average current-period advertising elasticity is 0.09, which is substantially smaller than the value of 0.12 recently reported by Sethuraman, Tellis, and Briesch (2011). Furthermore, our meta-analysis reveals a wide range of significant determinants of advertising elasticity. For example, we find that advertising elasticities are higher (i) for hedonic and experience goods than for other goods; (ii) for new than for established goods; (iii) when advertising is measured in gross rating points (GRP) instead of absolute terms; and (iv) when the lagged dependent variable or lagged advertising variable is omitted.
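For interpretation, the advertising elasticities being meta-analyzed are the standard log-log (point) elasticities:

```latex
% Current-period advertising elasticity: the percentage change in
% sales Q per one-percent change in advertising spend A.
\[
  \eta_{A} \;=\; \frac{\partial \ln Q}{\partial \ln A}
          \;=\; \frac{\partial Q}{\partial A}\cdot\frac{A}{Q}
\]
```

So the reported mean of 0.09 says that, on average, a 1% increase in advertising spend is associated with roughly a 0.09% increase in same-period sales.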
Abstract:
Ontologies and Methods for Interoperability of Engineering Analysis Models (EAMs) in an e-Design Environment. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Sciences, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse.
Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support the integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge; the instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a proof-of-concept test bed. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost, shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-base tool was developed and implemented in FiPER. This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge sharing method, which allocates permissions to portions of knowledge in order to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained, controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD. These methods play an efficient role in reducing the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.