31 results for Framework Model


Relevance:

30.00%

Publisher:

Abstract:

Many methodologies dealing with prediction or simulation of soft tissue deformations on medical image data require preprocessing of the data in order to produce a different shape representation that complies with standard methodologies, such as mass–spring networks or the finite element method (FEM). On the other hand, methodologies working directly in the image space normally do not take into account the mechanical behavior of tissues and tend to lack the physical foundations driving soft tissue deformations. This chapter presents a method to simulate soft tissue deformations based on coupled concepts from image analysis and mechanics theory. The proposed methodology is based on a robust stochastic approach that takes into account material properties retrieved directly from the image, concepts from continuum mechanics, and FEM. The optimization framework is solved within a hierarchical Markov random field (HMRF), which is implemented on the graphics processing unit (GPU).
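
As a rough illustration of the kind of MRF optimization involved, the sketch below labels each pixel with a displacement magnitude by iterated conditional modes (ICM) over a single-level MRF energy whose data term is derived from image intensities (standing in for material properties) and whose smoothness term is a Potts penalty. It is a toy stand-in only: the chapter's method uses a hierarchical MRF with FEM-based continuum-mechanics terms solved on the GPU, and every name and parameter below is an illustrative assumption.

```python
# Toy single-level MRF for image-driven deformation labelling, solved with ICM.
# Illustrative only: the chapter uses a hierarchical MRF with FEM-based terms on the GPU.
import numpy as np

def icm_mrf(image, candidates, beta=1.0, n_iter=10):
    """Assign one displacement label per pixel by minimizing a simple MRF energy."""
    h, w = image.shape
    labels = np.zeros((h, w), dtype=int)          # start from "no displacement"
    for _ in range(n_iter):
        for i in range(h):
            for j in range(w):
                best_l, best_e = labels[i, j], np.inf
                for l, disp in enumerate(candidates):
                    # Data term: stiffer (brighter) material resists large displacements.
                    data = image[i, j] * disp ** 2
                    # Smoothness term: Potts penalty over the 4-neighbourhood labels.
                    smooth = sum(labels[ni, nj] != l
                                 for ni, nj in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                                 if 0 <= ni < h and 0 <= nj < w)
                    e = data + beta * smooth
                    if e < best_e:
                        best_e, best_l = e, l
                labels[i, j] = best_l
    return labels

# Example: a synthetic "stiffness" image and three candidate displacement magnitudes.
img = np.random.rand(32, 32)
print(icm_mrf(img, candidates=[0.0, 0.5, 1.0])[:4, :4])
```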

Relevance:

30.00%

Publisher:

Abstract:

The aim of our study was to develop a modeling framework suitable to quantify the incidence, absolute number and economic impact of osteoporosis-attributable hip, vertebral and distal forearm fractures, with a particular focus on change over time, and with application to the situation in Switzerland from 2000 to 2020. A Markov process model was developed and analyzed by Monte Carlo simulation. A demographic scenario provided by the Swiss Federal Statistical Office and various Swiss and international data sources were used as model inputs. Demographic and epidemiologic input parameters were reproduced correctly, confirming the internal validity of the model. The proportion of the Swiss population aged 50 years or over will rise from 33.3% in 2000 to 41.3% in 2020. At the total population level, osteoporosis-attributable incidence will rise from 1.16 to 1.54 per 1,000 person-years in the case of hip fracture, from 3.28 to 4.18 per 1,000 person-years in the case of radiographic vertebral fracture, and from 0.59 to 0.70 per 1,000 person-years in the case of distal forearm fracture. Osteoporosis-attributable hip fracture numbers will rise from 8,375 to 11,353, vertebral fracture numbers will rise from 23,584 to 30,883, and distal forearm fracture numbers will rise from 4,209 to 5,186. Population-level osteoporosis-related direct medical inpatient costs per year will rise from 713.4 million Swiss francs (CHF) to CHF946.2 million. These figures correspond to 1.6% and 2.2% of Swiss health care expenditures in 2000. The modeling framework described can be applied to a wide variety of settings. It can be used to assess the impact of new prevention, diagnostic and treatment strategies. In Switzerland incidences of osteoporotic hip, vertebral and distal forearm fracture will rise by 33%, 27%, and 19%, respectively, between 2000 and 2020, if current prevention and treatment patterns are maintained. Corresponding absolute fracture numbers will rise by 36%, 31%, and 23%. Related direct medical inpatient costs are predicted to increase by 33%; however, this estimate is subject to uncertainty due to limited availability of input data.
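
For readers unfamiliar with the approach, the following minimal sketch shows how a Markov process model of this kind can be evaluated by Monte Carlo simulation; the states, yearly transition probabilities and time horizon are hypothetical placeholders, not the Swiss demographic and epidemiologic inputs used in the study.

```python
# Minimal sketch of a Markov cohort model evaluated by Monte Carlo simulation.
# States and yearly transition probabilities are hypothetical placeholders.
import random

P = {   # yearly transitions: state -> list of (next_state, probability); rows sum to 1
    "healthy":       [("healthy", 0.990), ("hip_fracture", 0.002), ("dead", 0.008)],
    "hip_fracture":  [("post_fracture", 0.80), ("dead", 0.20)],
    "post_fracture": [("post_fracture", 0.970), ("hip_fracture", 0.004), ("dead", 0.026)],
    "dead":          [("dead", 1.0)],
}

def simulate_person(years=20):
    """Return (hip fractures sustained, person-years lived) for one simulated individual."""
    state, fractures, lived = "healthy", 0, 0
    for _ in range(years):
        if state == "dead":
            break
        lived += 1
        r, cum = random.random(), 0.0
        for nxt, p in P[state]:
            cum += p
            if r < cum:
                state = nxt
                break
        if state == "hip_fracture":
            fractures += 1
    return fractures, lived

results = [simulate_person() for _ in range(100_000)]
fractures = sum(f for f, _ in results)
person_years = sum(l for _, l in results)
print(f"simulated incidence: {1000 * fractures / person_years:.2f} per 1,000 person-years")
```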

Relevance:

30.00%

Publisher:

Abstract:

The apical-basal axis of the early plant embryo determines the body plan of the adult organism. To establish a polarized embryonic axis, plants evolved a unique mechanism that involves directional, cell-to-cell transport of the growth regulator auxin. Auxin transport relies on PIN auxin transporters [1], whose polar subcellular localization determines the flow directionality. PIN-mediated auxin transport mediates the spatial and temporal activity of the auxin response machinery [2-7] that contributes to embryo patterning processes, including establishment of the apical (shoot) and basal (root) embryo poles [8]. However, little is known about the upstream mechanisms guiding the (re)polarization of auxin fluxes during embryogenesis [9]. Here, we developed a model of plant embryogenesis that correctly generates emergent cell polarities and auxin-mediated sequential initiation of the apical-basal axis of the plant embryo. The model relies on two precisely localized auxin sources and a feedback between auxin and the polar, subcellular PIN transporter localization. Simulations reproduced PIN polarity and auxin distribution, as well as previously unknown polarization events during early embryogenesis. The spectrum of validated model predictions suggests that our model corresponds to a minimal mechanistic framework for initiation and orientation of the apical-basal axis to guide both embryonic and postembryonic plant development.
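
A minimal 1D caricature of such a model is sketched below: a file of cells with two localized auxin sources and a feedback rule that allocates each cell's PIN pool to the faces adjoining auxin-richer neighbours. The feedback rule, parameters and integration scheme are illustrative assumptions, not the published model equations.

```python
# 1D sketch of auxin transport with PIN polarity feedback along a file of cells.
# All parameters and the feedback rule are illustrative assumptions.
import numpy as np

N, dt, steps = 20, 0.01, 20000
T, D, prod, decay = 1.0, 0.1, 1.0, 0.05   # transport, diffusion, production, turnover
sources = [0, N - 1]                       # two localized auxin sources

a = np.zeros(N)
for _ in range(steps):
    # PIN allocation: each cell splits a fixed PIN pool between its two faces
    # according to how much auxin its left/right neighbours hold.
    left  = np.roll(a, 1);  left[0] = 0.0
    right = np.roll(a, -1); right[-1] = 0.0
    tot = left + right + 1e-9
    pin_left, pin_right = left / tot, right / tot

    flux_right = T * a * pin_right          # active efflux towards the right neighbour
    flux_left  = T * a * pin_left           # active efflux towards the left neighbour
    da = -flux_right - flux_left
    da[1:]  += flux_right[:-1]              # influx from the left neighbour
    da[:-1] += flux_left[1:]                # influx from the right neighbour
    da += D * (np.roll(a, 1) + np.roll(a, -1) - 2 * a)   # passive diffusion
    da[0]  -= D * (a[-1] - a[0])            # undo the periodic wrap at the ends
    da[-1] -= D * (a[0] - a[-1])
    da -= decay * a
    da[sources] += prod
    a += dt * da

print(np.round(a, 2))   # steady auxin profile along the cell file
```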

Relevance:

30.00%

Publisher:

Abstract:

Given a reproducing kernel Hilbert space (H, ⟨·,·⟩) of real-valued functions and a suitable measure μ over the source space D ⊂ R, we decompose H as the sum of a subspace of centered functions for μ and its orthogonal in H. This decomposition leads to a special case of ANOVA kernels, for which the functional ANOVA representation of the best predictor can be elegantly derived, either in an interpolation or a regularization framework. The proposed kernels appear to be particularly convenient for analyzing the effect of each (group of) variable(s) and computing sensitivity indices without recursivity.
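
For reference, the zero-mean decomposition commonly used to build such ANOVA kernels can be written as follows; this is a sketch of a standard construction, and the authors' exact operators, assumptions on μ and notation may differ.

```latex
% Sketch of the standard zero-mean decomposition behind such ANOVA kernels.
% Split H into mu-centred functions and their orthogonal complement:
%   H_0 = { f in H : \int_D f \, d\mu = 0 },   H = H_0 \oplus H_0^{\perp}.
% The reproducing kernel of H_0 is
\[
  k_0(x, x') = k(x, x')
  - \frac{\int_D k(x, s)\,d\mu(s)\,\int_D k(x', t)\,d\mu(t)}
         {\iint_{D\times D} k(s, t)\,d\mu(s)\,d\mu(t)},
\]
% and a d-variate ANOVA kernel can be assembled as the tensor product
\[
  K(\mathbf{x}, \mathbf{x}') = \prod_{i=1}^{d} \bigl(1 + k_0^{(i)}(x_i, x'_i)\bigr),
\]
% so each term of the functional ANOVA expansion of the best predictor can be
% read off without recursion, and sensitivity indices follow directly.
```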

Relevance:

30.00%

Publisher:

Abstract:

The use of biomarkers to infer drug response in patients is being actively pursued, yet significant challenges with this approach, including the complicated interconnection of pathways, have limited its application. Direct empirical testing of tumor sensitivity would arguably provide a more reliable predictive value, although it has garnered little attention largely due to the technical difficulties associated with this approach. We hypothesize that the application of recently developed microtechnologies, coupled to more complex 3-dimensional cell cultures, could provide a model to address some of these issues. As a proof of concept, we developed a microfluidic device where spheroids of the serous epithelial ovarian cancer cell line TOV112D are entrapped and assayed for their chemoresponse to carboplatin and paclitaxel, two therapeutic agents routinely used for the treatment of ovarian cancer. In order to index the chemoresponse, we analyzed the spatiotemporal evolution of the mortality fraction, as judged by vital dyes and confocal microscopy, within spheroids subjected to different drug concentrations and treatment durations inside the microfluidic device. To reflect microenvironment effects, we tested the effect of exogenous extracellular matrix and serum supplementation during spheroid formation on their chemotherapeutic response. Spheroids displayed augmented chemoresistance in comparison to monolayer culturing. This resistance was further increased by the simultaneous presence of both extracellular matrix and high serum concentration during spheroid formation. Following exposure to chemotherapeutics, cell death profiles were not uniform throughout the spheroid. The highest cell death fraction was found at the center of the spheroid and the lowest at the periphery. Collectively, the results demonstrate the validity of the approach, and provide the basis for further investigation of chemotherapeutic responses in ovarian cancer using microfluidics technology. In the future, such microdevices could provide the framework to assay drug sensitivity in a timeframe suitable for clinical decision making.

Relevance:

30.00%

Publisher:

Abstract:

Water-conducting faults and fractures were studied in the granite-hosted Äspö Hard Rock Laboratory (SE Sweden). On a scale of decametres and larger, steeply dipping faults dominate and contain a variety of different fault rocks (mylonites, cataclasites, fault gouges). On a smaller scale, somewhat less regular fracture patterns were found. Conceptual models of the fault and fracture geometries and of the properties of rock types adjacent to fractures were derived and used as input for the modelling of in situ dipole tracer tests that were conducted in the framework of the Tracer Retention Understanding Experiment (TRUE-1) on a scale of metres. After the identification of all relevant transport and retardation processes, blind predictions of the breakthroughs of conservative to moderately sorbing tracers were calculated and then compared with the experimental data. This paper provides the geological basis and model calibration, while the predictive and inverse modelling work is the topic of the companion paper [J. Contam. Hydrol. 61 (2003) 175]. The TRUE-1 experimental volume is highly fractured and contains the same types of fault rocks and alterations as on the decametric scale. The experimental flow field was modelled on the basis of a 2D-streamtube formalism with an underlying homogeneous and isotropic transmissivity field. Tracer transport was modelled using the dual porosity medium approach, which is linked to the flow model by the flow porosity. Given the substantial pumping rates in the extraction borehole, the transport domain has a maximum width of a few centimetres only. It is concluded that both the uncertainty with regard to the length of individual fractures and the detailed geometry of the network along the flowpath between injection and extraction boreholes are not critical because flow is largely one-dimensional, whether through a single fracture or a network. Process identification and model calibration were based on a single uranine breakthrough (test PDT3), which clearly showed that matrix diffusion had to be included in the model even over the short experimental time scales, evidenced by a characteristic shape of the trailing edge of the breakthrough curve. Using the geological information and therefore considering limited matrix diffusion into a thin fault gouge horizon resulted in a good fit to the experiment. On the other hand, fresh granite was found not to interact noticeably with the tracers over the time scales of the experiments. While fracture-filling gouge materials are very efficient in retarding tracers over short periods of time (hours–days), their volume is very small and, with time progressing, retardation will be dominated by altered wall rock and, finally, by fresh granite. In such rocks, both porosity (and therefore the effective diffusion coefficient) and sorption Kds are more than one order of magnitude smaller compared to fault gouge, thus indicating that long-term retardation is expected to occur but to be less pronounced.
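
To make the dual-porosity idea concrete, the toy script below transports a tracer pulse along a 1D streamtube and exchanges mass with an immobile (fault gouge) zone. Matrix diffusion is reduced here to first-order mass transfer, and all parameters are illustrative, not the calibrated TRUE-1 values.

```python
# Toy 1D streamtube transport with a dual-porosity (mobile/immobile) approximation.
# Matrix diffusion into fault gouge is reduced to first-order mass transfer;
# all parameters are illustrative assumptions.
import numpy as np

L, nx = 5.0, 200                        # flow path length [m], grid cells
dx = L / nx
v, alpha, theta_im = 1e-4, 1e-6, 0.05   # velocity [m/s], exchange rate [1/s], immobile/mobile capacity ratio
dt = 0.5 * dx / v                       # explicit upwind stability (Courant 0.5)
steps = int(3 * L / v / dt)             # simulate ~3 pore volumes

c  = np.zeros(nx)                       # mobile (fracture) concentration
ci = np.zeros(nx)                       # immobile (gouge) concentration
breakthrough = []
for n in range(steps):
    c_in = 1.0 if n * dt < 3600.0 else 0.0      # 1-hour tracer pulse at the inlet
    adv = np.empty(nx)
    adv[0]  = -v * (c[0] - c_in) / dx           # upwind advection
    adv[1:] = -v * (c[1:] - c[:-1]) / dx
    exch = alpha * (c - ci)                     # first-order exchange with immobile zone
    c  += dt * (adv - exch)
    ci += dt * exch / theta_im
    breakthrough.append(c[-1])

print(f"peak outlet concentration: {max(breakthrough):.3f}")
```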

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Proper delineation of ocular anatomy in 3D imaging is a major challenge, particularly when developing treatment plans for ocular diseases. Magnetic Resonance Imaging (MRI) is nowadays used in clinical practice for diagnosis confirmation and treatment planning of retinoblastoma in infants, where it serves as a source of information complementary to fundus or ultrasound imaging. Here we present a framework to fully automatically segment the eye anatomy in MRI based on 3D Active Shape Models (ASM); we validate the results and present a proof of concept for automatically segmenting pathological eyes. Material and Methods: Manual and automatic segmentations were performed on 24 images of healthy children's eyes (3.29±2.15 years). Imaging was performed using a 3T MRI scanner. The ASM comprises the lens, the vitreous humor, the sclera and the cornea. The model was fitted by first automatically detecting the positions of the eye center, the lens and the optic nerve, then aligning the model and fitting it to the patient. We validated our segmentation method using leave-one-out cross-validation. The segmentation results were evaluated by measuring the overlap, using the Dice Similarity Coefficient (DSC), and the mean distance error. Results: We obtained a DSC of 94.90±2.12% for the sclera and the cornea, 94.72±1.89% for the vitreous humor and 85.16±4.91% for the lens. The mean distance error was 0.26±0.09 mm. The entire process took 14 s on average per eye. Conclusion: We provide a reliable and accurate tool that enables clinicians to automatically segment the sclera, the cornea, the vitreous humor and the lens in MRI. We additionally present a proof of concept for fully automatically segmenting pathological eyes. This tool reduces the time needed for eye shape delineation and can thus help clinicians when planning eye treatment and confirming the extent of the tumor.
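
The two validation metrics quoted above can be computed as in the following sketch, assuming boolean segmentation masks on a known voxel spacing; the arrays and spacing are hypothetical example data.

```python
# Sketch of the two validation metrics used above: Dice Similarity Coefficient (DSC)
# and a mean surface-distance error between an automatic and a manual 3D segmentation.
import numpy as np
from scipy.ndimage import distance_transform_edt, binary_erosion

def dice(auto, manual):
    inter = np.logical_and(auto, manual).sum()
    return 2.0 * inter / (auto.sum() + manual.sum())

def mean_surface_distance(auto, manual, spacing=(1.0, 1.0, 1.0)):
    surf_a = auto & ~binary_erosion(auto)        # boundary voxels of each mask
    surf_m = manual & ~binary_erosion(manual)
    # Distance from every voxel to the nearest boundary voxel of the other mask.
    d_to_m = distance_transform_edt(~surf_m, sampling=spacing)
    d_to_a = distance_transform_edt(~surf_a, sampling=spacing)
    return 0.5 * (d_to_m[surf_a].mean() + d_to_a[surf_m].mean())

# Hypothetical masks: two slightly offset cubes standing in for segmentations.
auto   = np.zeros((40, 40, 40), bool); auto[10:30, 10:30, 10:30] = True
manual = np.zeros((40, 40, 40), bool); manual[11:31, 10:30, 10:30] = True
print(f"DSC: {100 * dice(auto, manual):.2f}%  "
      f"mean distance: {mean_surface_distance(auto, manual):.2f} mm")
```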

Relevance:

30.00%

Publisher:

Abstract:

In this paper, reconstruction of three-dimensional (3D) patient-specific models of a hip joint from two-dimensional (2D) calibrated X-ray images is addressed. Existing 2D-3D reconstruction techniques usually reconstruct a patient-specific model of a single anatomical structure without considering the relationship to its neighboring structures. Thus, when those techniques are applied to the reconstruction of patient-specific models of a hip joint, the reconstructed models may penetrate each other due to the narrowness of the hip joint space and hence do not represent a true hip joint of the patient. To address this problem we propose a novel 2D-3D reconstruction framework using an articulated statistical shape model (aSSM). Different from previous work on constructing an aSSM, where the joint posture is modeled as articulation in a training set via statistical analysis, here it is modeled as a parametrized rotation of the femur around the joint center. The exact rotation of the hip joint, as well as the patient-specific models of the joint structures, i.e., the proximal femur and the pelvis, are then estimated by optimally fitting the aSSM to a limited number of calibrated X-ray images. Taking models segmented from CT data as the ground truth, we conducted validation experiments on both plastic and cadaveric bones. Qualitatively, the experimental results demonstrated that the proposed 2D-3D reconstruction framework preserved the hip joint structure and no model penetration was found. Quantitatively, average reconstruction errors of 1.9 mm and 1.1 mm were found for the pelvis and the proximal femur, respectively.
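
The articulation parametrization described above, i.e. a rigid rotation of the femur component about the hip joint centre controlled by a few angle parameters, can be sketched as follows; the vertex array, joint centre and angles are hypothetical example values.

```python
# Sketch of articulating the femur part of a shape model about the hip joint centre.
import numpy as np

def rotation_matrix(rx, ry, rz):
    """Rotation from x-y-z Euler angles (radians)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def articulate_femur(femur_vertices, joint_centre, angles):
    """Rotate femur vertices (N x 3) rigidly about the hip joint centre."""
    R = rotation_matrix(*angles)
    return (femur_vertices - joint_centre) @ R.T + joint_centre

femur = np.random.rand(1000, 3) * 100.0       # stand-in for femur model vertices [mm]
centre = np.array([50.0, 50.0, 50.0])         # stand-in hip joint centre
posed = articulate_femur(femur, centre, angles=(0.1, 0.0, -0.05))
print(posed.shape)
```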

Relevance:

30.00%

Publisher:

Abstract:

Background: Complete-pelvis segmentation in antero-posterior (AP) pelvic radiographs is required to create a patient-specific three-dimensional pelvis model for surgical planning and postoperative assessment in image-free navigation of total hip arthroplasty (THA). Methods: A fast and robust framework for accurately segmenting the complete pelvis is presented, consisting of two consecutive modules. In the first module, a three-stage method was developed to delineate the left hemi-pelvis based on statistical appearance and shape models. To handle complex pelvic structures, anatomy-specific information processing techniques were employed. As input to the second module, the delineated left hemi-pelvis was then reflected about an estimated symmetry line of the radiograph to initialize the right hemi-pelvis segmentation, which was performed with the same three-stage method. Results: Two experiments, conducted on 143 and 40 AP radiographs respectively, demonstrated a mean segmentation accuracy of 1.61±0.68 mm. A clinical study investigating the postoperative assessment of acetabular cup orientation based on the proposed framework revealed an average accuracy of 1.2°±0.9° and 1.6°±1.4° for anteversion and inclination, respectively. Delineation of each radiograph takes less than one minute. Conclusions: Although further validation is needed, the preliminary results suggest the clinical applicability of the proposed framework for image-free THA.
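
The initialization step described above can be sketched as follows, assuming for simplicity a vertical symmetry line in pixel coordinates; the contour points and line position are hypothetical.

```python
# Sketch: mirror the delineated left hemi-pelvis contour about the estimated symmetry
# line of the radiograph to seed the right hemi-pelvis segmentation.
import numpy as np

def reflect_about_vertical_line(points, x_sym):
    """Mirror 2D landmark points (N x 2, columns x/y in pixels) about x = x_sym."""
    mirrored = points.copy()
    mirrored[:, 0] = 2.0 * x_sym - points[:, 0]
    return mirrored

left_contour = np.array([[120.0, 340.0], [135.0, 352.0], [150.0, 360.0]])  # hypothetical points
right_init = reflect_about_vertical_line(left_contour, x_sym=256.0)        # hypothetical symmetry line
print(right_init)
```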

Relevance:

30.00%

Publisher:

Abstract:

The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans' motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussian (MOG) models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to simplify the burden often imposed on users; here, only a single image which includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode ‘skeletons’ for straightforward motility quantification. We test our algorithm on various locomotive environments and compare performances with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans' segmentation and ‘skeletonizing’ across a wide range of motility assays.
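
The MOG idea can be sketched as follows: fit one Gaussian mixture to worm pixels and one to background pixels of a single annotated example image, then label new pixels by comparing log-likelihoods. The feature choice (raw intensity), component counts and data are assumptions for illustration, not the MEME implementation.

```python
# Sketch of foreground/background Mixture-of-Gaussians segmentation from one example image.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_models(image, worm_mask, n_components=3):
    """Learn one mixture for worm pixels and one for background pixels."""
    fg = GaussianMixture(n_components).fit(image[worm_mask].reshape(-1, 1))
    bg = GaussianMixture(n_components).fit(image[~worm_mask].reshape(-1, 1))
    return fg, bg

def segment(image, fg, bg):
    pix = image.reshape(-1, 1)
    # A pixel belongs to the worm where the foreground model explains it better.
    return (fg.score_samples(pix) > bg.score_samples(pix)).reshape(image.shape)

# Hypothetical example: one training frame with a rough manual mask, then a new frame.
train = np.random.rand(200, 200)
mask = np.zeros_like(train, bool); mask[90:110, 50:150] = True
fg, bg = fit_models(train, mask)
labels = segment(np.random.rand(200, 200), fg, bg)
print(labels.mean())
```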

Relevance:

30.00%

Publisher:

Abstract:

Information on the relationship between cumulative fossil CO2 emissions and multiple climate targets is essential to design emission mitigation and climate adaptation strategies. In this study, the transient response of a climate or environmental variable per trillion tonnes of CO2 emissions, termed TRE, is quantified for a set of impact-relevant climate variables and from a large set of multi-forcing scenarios extended to year 2300 towards stabilization. An ∼1000-member ensemble of the Bern3D-LPJ carbon–climate model is applied and model outcomes are constrained by 26 physical and biogeochemical observational data sets in a Bayesian, Monte Carlo-type framework. Uncertainties in TRE estimates include both scenario uncertainty and model response uncertainty. Cumulative fossil emissions of 1000 Gt C result in a global mean surface air temperature change of 1.9 °C (68 % confidence interval (c.i.): 1.3 to 2.7 °C), a decrease in surface ocean pH of 0.19 (0.18 to 0.22), and a steric sea level rise of 20 cm (13 to 27 cm until 2300). Linearity between cumulative emissions and transient response is high for pH and reasonably high for surface air and sea surface temperatures, but less pronounced for changes in Atlantic meridional overturning, Southern Ocean and tropical surface water saturation with respect to biogenic structures of calcium carbonate, and carbon stocks in soils. The constrained model ensemble is also applied to determine the response to a pulse-like emission and in idealized CO2-only simulations. The transient climate response is constrained, primarily by long-term ocean heat observations, to 1.7 °C (68 % c.i.: 1.3 to 2.2 °C) and the equilibrium climate sensitivity to 2.9 °C (2.0 to 4.2 °C). This is consistent with results by CMIP5 models but inconsistent with recent studies that relied on short-term air temperature data affected by natural climate variability.
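
Operationally, a TRE value can be diagnosed from scenario output as the regression slope of the change in a climate variable against cumulative emissions, scaled to 1000 Gt C, as in the sketch below; the numbers are made-up placeholders, not Bern3D-LPJ output.

```python
# Sketch: diagnose a TRE-style value as the slope of warming vs. cumulative emissions.
import numpy as np

cum_emissions_GtC = np.array([390.0, 610.0, 1000.0, 1430.0, 1890.0])   # placeholder scenarios
delta_T = np.array([0.8, 1.2, 1.9, 2.7, 3.5])                          # placeholder warming [deg C]

slope, intercept = np.polyfit(cum_emissions_GtC, delta_T, 1)
tre = slope * 1000.0            # response per 1000 Gt C of cumulative emissions
print(f"TRE ~ {tre:.2f} deg C per 1000 Gt C (intercept {intercept:.2f} deg C)")
```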

Relevance:

30.00%

Publisher:

Abstract:

Trabecular bone is a porous mineralized tissue playing a major load bearing role in the human body. Prediction of age-related and disease-related fractures and the behavior of bone implant systems needs a thorough understanding of its structure-mechanical property relationships, which can be obtained using microcomputed tomography-based finite element modeling. In this study, a nonlinear model for trabecular bone as a cohesive-frictional material was implemented in a large-scale computational framework and validated by comparison of μFE simulations with experimental tests in uniaxial tension and compression. A good correspondence of stiffness and yield points between simulations and experiments was found for a wide range of bone volume fraction and degree of anisotropy in both tension and compression using a non-calibrated, average set of material parameters. These results demonstrate the ability of the model to capture the effects leading to failure of bone for three anatomical sites and several donors, which may be used to determine the apparent behavior of trabecular bone and its evolution with age, disease, and treatment in the future.
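
As a stand-in for the more elaborate constitutive law used in the study, the sketch below evaluates a generic cohesive-frictional (Drucker-Prager-type) yield function, in which yielding depends on both deviatoric stress and pressure; the cohesion and friction parameters are assumed values.

```python
# Generic cohesive-frictional (Drucker-Prager-type) yield check, illustrative only.
import numpy as np

def drucker_prager_yield(stress, alpha=0.3, k=80.0):
    """Return f = sqrt(J2) + alpha * I1 - k for a 3x3 stress tensor [MPa]; f >= 0 means yield."""
    I1 = np.trace(stress)                        # pressure-related invariant
    dev = stress - I1 / 3.0 * np.eye(3)          # deviatoric part
    J2 = 0.5 * np.tensordot(dev, dev)            # second deviatoric invariant
    return np.sqrt(J2) + alpha * I1 - k

# Equal-magnitude tension and compression states sit at different distances from yield,
# reflecting the tension-compression strength asymmetry of such materials.
tension = np.diag([60.0, 0.0, 0.0])
compression = np.diag([-60.0, 0.0, 0.0])
print(drucker_prager_yield(tension), drucker_prager_yield(compression))
```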

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE The implementation of genomic-based medicine is hindered by unresolved questions regarding data privacy and delivery of interpreted results to health-care practitioners. We used DNA-based prediction of HIV-related outcomes as a model to explore critical issues in clinical genomics. METHODS We genotyped 4,149 markers in HIV-positive individuals. Variants allowed for prediction of 17 traits relevant to HIV medical care, inference of patient ancestry, and imputation of human leukocyte antigen (HLA) types. Genetic data were processed under a privacy-preserving framework using homomorphic encryption, and clinical reports describing potentially actionable results were delivered to health-care providers. RESULTS A total of 230 patients were included in the study. We demonstrated the feasibility of encrypting a large number of genetic markers, inferring patient ancestry, computing monogenic and polygenic trait risks, and reporting results under privacy-preserving conditions. The average execution time of a multimarker test on encrypted data was 865 ms on a standard computer. The proportion of tests returning potentially actionable genetic results ranged from 0 to 54%. CONCLUSIONS The model of implementation presented herein informs on strategies to deliver genomic test results for clinical care. Data encryption to ensure privacy helps to build patient trust, a key requirement on the road to genomic-based medicine. Genet Med advance online publication, 14 January 2016. Genetics in Medicine (2016); doi:10.1038/gim.2015.167.
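
Conceptually, the privacy-preserving computation resembles the following sketch, written with the additively homomorphic python-paillier library; the study used its own homomorphic-encryption framework, and the library choice, weights and genotypes here are illustrative assumptions only.

```python
# Conceptual sketch of scoring encrypted genotypes with an additively homomorphic scheme.
# Not the study's framework; requires the third-party package "phe" (python-paillier).
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

genotypes = [0, 1, 2, 1]                 # risk-allele counts at four hypothetical markers
weights = [0.4, -0.2, 0.9, 0.1]          # hypothetical per-marker effect sizes

# Patient side: encrypt genotypes before sending them out.
enc_genotypes = [public_key.encrypt(g) for g in genotypes]

# Server side: compute the weighted score directly on ciphertexts
# (addition and multiplication by a plaintext scalar are permitted under Paillier).
enc_score = enc_genotypes[0] * weights[0]
for g, w in zip(enc_genotypes[1:], weights[1:]):
    enc_score = enc_score + g * w

# Back on the trusted side: only the holder of the private key can read the result.
print(f"polygenic score: {private_key.decrypt(enc_score):.2f}")
```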

Relevance:

30.00%

Publisher:

Abstract:

Synopsis: Sport organisations are facing multiple challenges originating from an increasingly complex and dynamic environment in general, and from internal changes in particular. Our study seeks to reveal and analyse the causes of professionalization processes in international sport federations, the resulting forms, as well as related consequences. Abstract: AIM OF ABSTRACT/PAPER - RESEARCH QUESTION Sport organisations are facing multiple challenges originating from an increasingly complex and dynamic environment in general, and from internal changes in particular. In this context, professionalization seems to have been adopted by sport organisations as an appropriate strategy to respond to pressures such as becoming more “business-like”. The ongoing study seeks to reveal and analyse the internal and external causes of professionalization processes in international sport federations, the resulting forms (e.g. organisational, managerial, economic) as well as related consequences for objectives, values, governance methods, performance management or rationalisation. THEORETICAL BACKGROUND/LITERATURE REVIEW Studies on sport as a specific non-profit sector mainly focus on the prospect of the “professionalization of individuals” (Thibault, Slack & Hinings, 1991), often within sport clubs (Thiel, Meier & Cachay, 2006) and national sport federations (Seippel, 2002), or on organisational change (Girginov & Sandanski, 2008; Slack & Hinings, 1987, 1992; Slack, 1985, 2001), thus leaving broader analysis of governance, management and professionalization in sport organisations largely unaccomplished. In order to further current research on the above-mentioned topics, our intention is to analyse the causes, forms and consequences of professionalization processes in international sport federations. The social theory of action (Coleman, 1986; Esser, 1993) was chosen as the theoretical framework, from which a multi-level framework for the analysis of sport organisations is derived (Nagel, 2007). In light of this multi-level framework, sport federations are conceptualised as corporative actors whose objectives are defined and implemented with regard to the interests of member organisations (Heinemann, 2004) and/or other pressure groups. In order to understand the social action and social structures (Giddens, 1984) of sport federations, two levels are the focus of our analysis: the macro level, examining the environment at large (political, social and economic systems, etc.), and the meso level (Esser, 1999), examining organisational structures, actions and decisions of the federation’s headquarters as well as of member organisations. METHODOLOGY, RESEARCH DESIGN AND DATA ANALYSIS The multi-level framework outlined above is used to gather and analyse information on the causes, forms and consequences of professionalization processes in sport federations. It is applied in a twofold approach: first, an exploratory study based on nine semi-structured interviews with experts from umbrella sport organisations (IOC, WADA, ASOIF, AIOWF, etc.) as well as the analysis of related documents, relevant reports (IOC report 2000 on governance reform, Agenda 2020, etc.) and important moments of change in the Olympic Movement (Olympic revenue share, IOC evaluation criteria, etc.); and second, several case studies. Whereas the exploratory study focuses more on the causes of professionalization at the external, internal and headquarters levels as depicted in the literature, the case studies focus on forms and consequences.
Applying our conceptual framework, the analysis of forms is built around three dimensions: 1) Individuals (persons and positions), 2) Processes and structures (formalisation, specialisation), 3) Activities (strategic planning). With regard to consequences, we centre our attention on expectations of and relationships with stakeholders (e.g. cooperation with business partners), structure, culture and processes (e.g. governance models, performance), and expectations of and relationships with member organisations (e.g. centralisation vs. regionalisation). For the case studies, a mixed-method approach is applied to collect relevant data: questionnaires for rather quantitative data, interviews for rather qualitative data, as well as document and observational analysis. RESULTS, DISCUSSION AND IMPLICATIONS/CONCLUSIONS With regard to the causes of professionalization processes, we analyse three different levels: 1. the external level, where the main pressure derives from financial resources (stakeholders, benefactors) and important turning points (scandals, media pressure, IOC requirements for Olympic sports); 2. the internal level, where pressure from member organisations turned out to be less decisive than assumed (little involvement of member organisations in decision-making); 3. the headquarters level, where specific economic models (World Cups, other international circuits, World Championships) and organisational structures (decision-making procedures, values, leadership) trigger or hinder a federation’s professionalization process. Based on our first analysis, an outline for an economic model is suggested, distinguishing four categories of IFs: “money-generating IFs”, rather based on commercialisation and strategic alliances; “classical Olympic IFs”, rather reactive and dependent on Olympic revenue; “classical non-Olympic IFs”, rather independent of the Olympic Movement; and “money-receiving IFs”, dependent on benefactors and with strong traditions and values. The results regarding forms and consequences will be outlined in the presentation. The first results from the two pilot studies will allow us to refine our conceptual framework for subsequent case studies, thus extending our data collection and developing fundamental conclusions.
References:
Bayle, E., & Robinson, L. (2007). A framework for understanding the performance of national governing bodies of sport. European Sport Management Quarterly, 7, 249–268.
Chantelat, P. (2001). La professionnalisation des organisations sportives: Nouveaux débats, nouveaux enjeux [Professionalisation of sport organisations]. Paris: L’Harmattan.
Dowling, M., Edwards, J., & Washington, M. (2014). Understanding the concept of professionalization in sport management research. Sport Management Review. Advance online publication. doi:10.1016/j.smr.2014.02.003
Ferkins, L., & Shilbury, D. (2012). Good Boards Are Strategic: What Does That Mean for Sport Governance? Journal of Sport Management, 26, 67–80.
Thibault, L., Slack, T., & Hinings, B. (1991). Professionalism, structures and systems: The impact of professional staff on voluntary sport organizations. International Review for the Sociology of Sport, 26, 83–97.

Relevance:

30.00%

Publisher:

Abstract:

Understanding the run-time behavior of software systems can be a challenging activity. Debuggers are an essential category of tools used for this purpose as they give developers direct access to the running systems. Nevertheless, traditional debuggers rely on generic mechanisms to introspect and interact with the running systems, while developers reason about and formulate domain-specific questions using concepts and abstractions from their application domains. This mismatch creates an abstraction gap between the debugging needs and the debugging support leading to an inefficient and error-prone debugging effort, as developers need to recover concrete domain concepts using generic mechanisms. To reduce this gap, and increase the efficiency of the debugging process, we propose a framework for developing domain-specific debuggers, called the Moldable Debugger, that enables debugging at the level of the application domain. The Moldable Debugger is adapted to a domain by creating and combining domain-specific debugging operations with domain-specific debugging views, and adapts itself to a domain by selecting, at run time, appropriate debugging operations and views. To ensure the proposed model has practical applicability (i.e., can be used in practice to build real debuggers), we discuss, from both a performance and usability point of view, three implementation strategies. We further motivate the need for domain-specific debugging, identify a set of key requirements and show how our approach improves debugging by adapting the debugger to several domains.
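
The idea can be illustrated conceptually as below (in Python, not the authors' Pharo implementation): domain-specific extensions bundle debugging operations and views with an activation predicate, and the debugger selects the matching extensions for the current run-time context. All names and the parser example are hypothetical.

```python
# Conceptual illustration of the Moldable Debugger idea: extensions with activation
# predicates, domain-specific operations, and domain-specific views, selected at run time.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class DebuggingExtension:
    name: str
    activates_on: Callable[[dict], bool]          # predicate over the current run-time context
    operations: Dict[str, Callable[[dict], None]] = field(default_factory=dict)
    views: Dict[str, Callable[[dict], str]] = field(default_factory=dict)

class MoldableDebuggerSketch:
    def __init__(self):
        self.extensions: List[DebuggingExtension] = []

    def register(self, ext: DebuggingExtension):
        self.extensions.append(ext)

    def select_extensions(self, context: dict) -> List[DebuggingExtension]:
        """Adapt to the domain: keep only the extensions whose predicate matches."""
        return [e for e in self.extensions if e.activates_on(context)]

# A hypothetical extension for debugging a parser domain: domain-level stepping and views.
parser_ext = DebuggingExtension(
    name="parser",
    activates_on=lambda ctx: ctx.get("domain") == "parser",
    operations={"step-to-next-production": lambda ctx: ctx["stream"].skip_to_production()},
    views={"remaining-input": lambda ctx: ctx["stream"].peek_rest()},
)

debugger = MoldableDebuggerSketch()
debugger.register(parser_ext)
print([e.name for e in debugger.select_extensions({"domain": "parser"})])
```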