959 resultados para Finite Volume Methods


Relevância:

30.00%

Publicador:

Resumo:

This paper presents a validation study on statistical nonsupervised brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different noise and intensity nonuniformities are added to simulate real imaging conditions. No enhancement of the image quality is considered either before or during the classification process. This way, the accuracy of the methods and their robustness against image artifacts are tested. Classification is also performed on real data where a quantitative validation compares the methods' results with an estimated ground truth from manual segmentations by experts. Validity of the various classification methods in the labeling of the image as well as in the tissue volume is estimated with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that only consider pure Gaussian classes. Finally, we show that simulated data results can also be extended to real data.
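The paper's central finding, that models using both intensity and spatial information resist noise better than intensity-only ones, can be illustrated with a deliberately crude 1-D sketch: nearest-mean intensity labelling, followed by a sliding-window majority vote standing in for a spatial model. All data below are synthetic and the method is far simpler than those assessed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D "scan": two tissue classes with known mean intensities.
truth = np.repeat([0, 1, 0, 1], 100)                    # piecewise-constant ground truth
means = np.array([100.0, 140.0])
image = means[truth] + rng.normal(0, 25, truth.size)    # heavy noise

# Intensity-only classification: assign each voxel to the nearest class mean.
intensity_label = np.abs(image[:, None] - means).argmin(axis=1)

# Add spatial information: majority vote in a sliding window, a crude
# stand-in for the MRF-style spatial models assessed in the paper.
k = 7
padded = np.pad(intensity_label, k // 2, mode="edge")
windows = np.lib.stride_tricks.sliding_window_view(padded, k)
spatial_label = (windows.mean(axis=1) > 0.5).astype(int)

err_intensity = (intensity_label != truth).mean()
err_spatial = (spatial_label != truth).mean()
print(err_intensity, err_spatial)
```

Even this toy spatial step cuts the voxel error rate substantially, which is the qualitative effect the validation study measures with proper image models.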

Relevância:

30.00%

Publicador:

Resumo:

BACKGROUND AND OBJECTIVE: Key factors of Fast Track (FT) programs are fluid restriction and epidural analgesia (EDA). We aimed to challenge the preconception that the combination of fluid restriction and EDA might induce hypotension and renal dysfunction. METHODS: A recent randomized trial (NCT00556790) showed reduced complications after colectomy in FT patients compared with standard care (SC). Patients with an effective EDA were compared with regard to hemodynamics and renal function. RESULTS: 61/76 FT patients and 59/75 patients in the SC group had an effective EDA. Both groups were comparable regarding demographics and surgery-related characteristics. FT patients received significantly less i.v. fluids intraoperatively (1900 mL [range 1100-4100] versus 2900 mL [1600-5900], P < 0.0001) and postoperatively (700 mL [400-1500] versus 2300 mL [1800-3800], P < 0.0001). Intraoperatively, 30 FT compared with 19 SC patients needed colloids or vasopressors, but this difference was not statistically significant (P = 0.066). Postoperative requirements were low in both groups (3 versus 5 patients; P = 0.487). Pre- and postoperative values for creatinine, hematocrit, sodium, and potassium were similar, and no patient in either group developed renal dysfunction. Only one of 82 patients who had an EDA without a bladder catheter developed urinary retention. Overall, FT patients had fewer postoperative complications (6 versus 20 patients; P = 0.002) and a shorter median hospital stay (5 d [range 2-30] versus 9 d [6-30]; P < 0.0001) than the SC group. CONCLUSIONS: Fluid restriction and EDA in FT programs are not associated with clinically relevant hemodynamic instability or renal dysfunction.

Relevância:

30.00%

Publicador:

Resumo:

Les catastrophes sont souvent perçues comme des événements rapides et aléatoires. Si les déclencheurs peuvent être soudains, les catastrophes, elles, sont le résultat d'une accumulation des conséquences d'actions et de décisions inappropriées ainsi que du changement global. Pour modifier cette perception du risque, des outils de sensibilisation sont nécessaires. Des méthodes quantitatives ont été développées et ont permis d'identifier la distribution et les facteurs sous-jacents du risque.

Le risque de catastrophes résulte de l'intersection entre aléas, exposition et vulnérabilité. La fréquence et l'intensité des aléas peuvent être influencées par le changement climatique ou le déclin des écosystèmes, la croissance démographique augmente l'exposition, alors que l'évolution du niveau de développement affecte la vulnérabilité. Chacune de ces composantes pouvant changer, le risque est dynamique et doit être réévalué périodiquement par les gouvernements, les assurances ou les agences de développement. Au niveau global, ces analyses sont souvent effectuées à l'aide de bases de données sur les pertes enregistrées. Nos résultats montrent que celles-ci sont susceptibles d'être biaisées, notamment par l'amélioration de l'accès à l'information. Elles ne sont pas exhaustives et ne donnent pas d'information sur l'exposition, l'intensité ou la vulnérabilité. Une nouvelle approche, indépendante des pertes reportées, est donc nécessaire.

Les recherches présentées ici ont été mandatées par les Nations Unies et par des agences oeuvrant dans le développement et l'environnement (PNUD, l'UNISDR, la GTZ, le PNUE et l'UICN). Ces organismes avaient besoin d'une évaluation quantitative sur les facteurs sous-jacents du risque, afin de sensibiliser les décideurs et pour la priorisation des projets de réduction des risques de catastrophes.

La méthode est basée sur les systèmes d'information géographique, la télédétection, les bases de données et l'analyse statistique. Une importante quantité de données (1,7 Tb) et plusieurs milliers d'heures de calcul ont été nécessaires. Un modèle de risque global a été élaboré pour révéler la distribution des aléas, de l'exposition et des risques, ainsi que pour identifier les facteurs de risque sous-jacents de plusieurs aléas (inondations, cyclones tropicaux, séismes et glissements de terrain). Deux indices de risque multiple ont été générés pour comparer les pays. Les résultats incluent une évaluation du rôle de l'intensité de l'aléa, de l'exposition, de la pauvreté et de la gouvernance dans la configuration et les tendances du risque. Il apparaît que les facteurs de vulnérabilité changent en fonction du type d'aléa et que, contrairement à l'exposition, leur poids décroît quand l'intensité augmente.

Au niveau local, la méthode a été testée pour mettre en évidence l'influence du changement climatique et du déclin des écosystèmes sur l'aléa. Dans le nord du Pakistan, la déforestation induit une augmentation de la susceptibilité aux glissements de terrain. Les recherches menées au Pérou (à base d'imagerie satellitaire et de collecte de données au sol) révèlent un retrait glaciaire rapide et donnent une évaluation du volume de glace restante ainsi que des scénarios sur l'évolution possible.

Ces résultats ont été présentés à des publics différents, notamment devant 160 gouvernements. Les résultats et les données générées sont accessibles en ligne (http://preview.grid.unep.ch). La méthode est flexible et facilement transposable à des échelles et problématiques différentes, offrant de bonnes perspectives pour l'adaptation à d'autres domaines de recherche.

La caractérisation du risque au niveau global et l'identification du rôle des écosystèmes dans le risque de catastrophe sont en plein développement. Ces recherches ont révélé de nombreux défis; certains ont été résolus, d'autres sont restés des limitations. Cependant, il apparaît clairement que le niveau de développement configure une grande partie des risques de catastrophes. La dynamique du risque est gouvernée principalement par le changement global.

Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems. Population growth increases the exposure, while changes in the level of development affect the vulnerability. Given that each of these components may change, the risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness amongst policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb on both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors. This was done for several hazards (e.g. floods, tropical cyclones, earthquakes and landslides). Two different multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that the vulnerability factors change depending on the type of hazard and that, unlike exposure, their weight decreases as the intensity increases.

Locally, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation exacerbates susceptibility to landslides. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and gave an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas. The characterization of risk at a global level and the identification of the role of ecosystems in disaster risk are developing rapidly. This research revealed many challenges; some were resolved, while others remain as limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.
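As a toy illustration of the multiplicative structure described above (risk at the intersection of hazard, exposure and vulnerability), the sketch below ranks three invented countries by expected annual losses. All figures are made up, and the model is far simpler than the thesis's global model:

```python
import numpy as np

# Toy multiplicative risk model: expected losses as
# hazard frequency x exposed population x vulnerability.
countries = ["A", "B", "C"]
hazard_freq = np.array([0.8, 0.3, 0.5])            # events per year (invented)
exposed_pop = np.array([2e6, 8e6, 1e6])            # people in hazard zones (invented)
vulnerability = np.array([0.002, 0.0005, 0.004])   # share of exposed lost per event

risk = hazard_freq * exposed_pop * vulnerability   # expected losses per year
ranking = [countries[i] for i in np.argsort(risk)[::-1]]
print(dict(zip(countries, risk)), ranking)
```

Note how the ranking is not driven by exposure alone: country B has by far the largest exposed population but the lowest vulnerability, which is the kind of interaction the thesis quantifies per hazard type.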

Relevância:

30.00%

Publicador:

Resumo:

OBJECTIVES: For certain major operations, inpatient mortality risk is lower in high-volume hospitals than in low-volume hospitals. Extending the analysis to a broader range of interventions and outcomes is necessary before adopting policies based on minimum volume thresholds. METHODS: Using the United States 2004 Nationwide Inpatient Sample, we assessed the effect of intervention-specific and overall hospital volume on surgical complications, potentially avoidable reoperations, and deaths across 1.4 million interventions in 353 hospitals. Outcome variations across hospitals were analyzed through a 3-level hierarchical logistic regression model (patients, surgical interventions, and hospitals), which took into account interventions on multiple organs, 144 intervention categories, and structural hospital characteristics. Discriminative performance and calibration were good. RESULTS: Hospitals with more experience in a given intervention had similar reoperation rates but lower mortality and complication rates (odds ratios per volume decile 0.93 and 0.97, respectively). However, the benefit was limited to heart surgery and a small number of other operations. Risks were higher in hospitals that performed more interventions overall: the odds ratio per 1000 interventions was approximately 1.02 for each event. Even after adjustment for specific volume, mortality varied substantially across both high- and low-volume hospitals. CONCLUSION: Although the link between specific volume and certain inpatient outcomes suggests that specialization might help improve surgical safety, the variable magnitude of this link and the heterogeneity of the hospital effect do not support the systematic use of volume-based referrals. It may be more efficient to monitor risk-adjusted postoperative outcomes and to investigate facilities with worse-than-expected outcomes.

Relevância:

30.00%

Publicador:

Resumo:

Advancements in high-throughput technologies to measure increasingly complex biological phenomena at the genomic level are rapidly changing the face of biological research from the single-gene single-protein experimental approach to studying the behavior of a gene in the context of the entire genome (and proteome). This shift in research methodologies has resulted in a new field of network biology that deals with modeling cellular behavior in terms of network structures such as signaling pathways and gene regulatory networks. In these networks, different biological entities such as genes, proteins, and metabolites interact with each other, giving rise to a dynamical system. Even though there exists a mature field of dynamical systems theory to model such network structures, some technical challenges are unique to biology such as the inability to measure precise kinetic information on gene-gene or gene-protein interactions and the need to model increasingly large networks comprising thousands of nodes. These challenges have renewed interest in developing new computational techniques for modeling complex biological systems. This chapter presents a modeling framework based on Boolean algebra and finite-state machines that are reminiscent of the approach used for digital circuit synthesis and simulation in the field of very-large-scale integration (VLSI). The proposed formalism enables a common mathematical framework to develop computational techniques for modeling different aspects of the regulatory networks such as steady-state behavior, stochasticity, and gene perturbation experiments.
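As a minimal instance of the Boolean finite-state-machine formalism the chapter proposes, the sketch below enumerates the synchronous state-transition map of a toy three-gene network and extracts its attractors (the steady-state behavior). The update rules are invented for illustration and are not taken from the chapter:

```python
from itertools import product

# A toy 3-gene Boolean network; each gene is ON (True) or OFF (False).
def step(state):
    a, b, c = state
    return (b and c,   # gene A is activated by B AND C
            not a,     # gene B is repressed by A
            a or b)    # gene C is activated by A OR B

# Exhaustive synchronous state-transition map over all 2**3 states
# (the finite-state-machine view of the network).
transition = {s: step(s) for s in product([False, True], repeat=3)}

# Attractor reached from a state: iterate until a state repeats,
# then return the cycle in a canonical (sorted) form.
def attractor(state):
    seen = []
    while state not in seen:
        seen.append(state)
        state = transition[state]
    cycle = seen[seen.index(state):]
    return tuple(sorted(cycle))

attractors = {attractor(s) for s in transition}
print(attractors)
```

For these rules every one of the 8 states flows into a single limit cycle of length 5; replacing `step` with measured regulatory logic, and perturbing one gene at a time, gives exactly the kind of steady-state and perturbation analysis the chapter's framework targets.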

Relevância:

30.00%

Publicador:

Resumo:

PURPOSE: To prospectively evaluate the accuracy and reliability of "freehand" posttraumatic orbital wall reconstruction with AO (Arbeitsgemeinschaft Osteosynthese) titanium mesh plates by using computer-aided volumetric measurement of the bony orbits. METHODS: Bony orbital volume was measured in 12 patients from coronal CT scan slices using OsiriX Medical Image software. After defining the volumetric limits of the orbit, the segmentation of the bony orbital region of interest of each single slice was performed. At the end of the segmentation process, all regions of interest were grouped and the volume was computed. The same procedure was performed on both orbits, and thereafter the volume of the contralateral uninjured orbit was used as a control for comparison. RESULTS: In all patients, the volume data of the reconstructed orbit fitted that of the contralateral uninjured orbit with accuracy to within 1.85 cm3 (7%). CONCLUSIONS: This preliminary study has demonstrated that posttraumatic orbital wall reconstruction using "freehand" bending and placement of AO titanium mesh plates results in a high success rate in re-establishing preoperative bony volume, which closely approximates that of the contralateral uninjured orbit.
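The volume computation described above is a slice-summation (Cavalieri-type) estimate: the segmented ROI areas are summed and multiplied by the slice spacing, then compared with the contralateral orbit. A sketch with invented numbers (the areas, spacing and control volume are illustrative, not study data):

```python
import numpy as np

def volume_from_slices(roi_areas_cm2, slice_thickness_cm):
    """Sum of per-slice ROI areas times slice spacing (Cavalieri estimator)."""
    return float(np.sum(roi_areas_cm2) * slice_thickness_cm)

# Invented per-slice ROI areas (cm^2) for coronal CT slices of one orbit.
areas = np.array([3.2, 4.1, 4.8, 5.0, 4.6, 3.9, 2.7])
v_injured = volume_from_slices(areas, 0.3)     # 0.3 cm slice spacing

# Compare against the contralateral uninjured orbit, as in the study;
# 1.85 cm^3 is the accuracy bound reported in the abstract.
v_control = 8.6
within_tolerance = abs(v_injured - v_control) <= 1.85
print(v_injured, within_tolerance)
```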

Relevância:

30.00%

Publicador:

Resumo:

Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and assess the resulting improvement in image quality in the case of linear reconstruction of one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor-MRI of the same subject and anisotropy of the skull was approximated from the structural information. A method for incorporation of anisotropy in the forward model and its use in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data, and then linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation for conductivity changes deep in the brain and due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality.
This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
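In linearized EIT of the kind described above, small conductivity changes map to boundary-voltage changes through a sensitivity (Jacobian) matrix, and reconstruction inverts that map with regularization. A toy sketch with a random stand-in Jacobian (no FEM, no anisotropy; all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Linearized EIT: boundary voltage changes dv relate to element-wise
# conductivity changes dsigma through a sensitivity matrix J, dv = J @ dsigma.
# Here J is random, standing in for the FEM-derived head-model Jacobian.
n_meas, n_elem = 60, 40
J = rng.normal(size=(n_meas, n_elem))

dsigma_true = np.zeros(n_elem)
dsigma_true[17] = -0.10            # a 10% conductivity decrease in one element

dv = J @ dsigma_true               # simulated (noiseless) forward data

# One-step Tikhonov-regularised inverse ("sensitivity matrix approach").
lam = 1e-3
dsigma_hat = np.linalg.solve(J.T @ J + lam * np.eye(n_elem), J.T @ dv)

print(int(np.argmin(dsigma_hat)))  # element with the largest estimated decrease
```

The study's point is that when the forward `J` is computed from the wrong (isotropic) model while the data come from an anisotropic head, the recovered perturbation shifts position, which is the localisation error of up to 24 mm reported above.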

Relevância:

30.00%

Publicador:

Resumo:

The purpose of this work was to test two measurement methods for the water vapour tightness of packages: one already in use at the research centre and one developed for the research centre in this work. The results obtained were compared with each other and with values measured from the packaging material. Food packages were also studied by means of humidity sensors, a shelf-life test and transport simulation, and optimization was used to examine the effect of package shape on water vapour tightness. The method developed for measuring the water vapour transmission of a package worked well and its repeatability was good. Compared with the existing method, the new method was faster and required less working time, but both methods gave good values for parallel samples. The humidity sensors made it possible to follow changes in the humidity inside an empty package during storage. The shelf-life test was carried out with breakfast cereals, and the best water vapour barrier was provided by packages with an aluminium laminate or metallized OPP layer. In the first transport test the packages were filled with cereals and in the second with noodles; the transport simulation had no effect on the integrity of the inner surfaces of the packages, and hence none on their water vapour tightness. The optimization compared the volume-to-surface-area ratio of packages of different shapes and the dependence of water vapour tightness on surface area. The optimal package proved to be a sphere, which has the smallest surface area, permits the highest water vapour transmission per unit of material and requires the least barrier material.
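The shape conclusion above is elementary geometry: for a fixed volume, the sphere exposes the least surface through which water vapour can pass, so it needs the least barrier material. A sketch comparing a sphere, a cube and a 1:1 cylinder (height equal to diameter) at an assumed volume of 1000 cm³:

```python
import math

# Surface area of three package shapes holding the same volume V.
V = 1000.0  # cm^3 (assumed for illustration)

r_sph = (3 * V / (4 * math.pi)) ** (1 / 3)       # sphere radius from V = 4/3*pi*r^3
area_sphere = 4 * math.pi * r_sph ** 2

a_cube = V ** (1 / 3)                            # cube edge from V = a^3
area_cube = 6 * a_cube ** 2

r_cyl = (V / (2 * math.pi)) ** (1 / 3)           # cylinder with h = 2r, V = 2*pi*r^3
area_cylinder = 2 * math.pi * r_cyl ** 2 + 2 * math.pi * r_cyl * (2 * r_cyl)

print(area_sphere, area_cylinder, area_cube)
```

The sphere needs roughly 20% less surface than the cube for the same contents, which is why it came out as the optimum in the shape study.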

Relevância:

30.00%

Publicador:

Resumo:

BACKGROUND: Reading volume and mammography screening performance appear positively correlated. Quality and effectiveness were compared across low-volume screening programmes targeting relatively small populations and operating under the same decentralised healthcare system. Except for accreditation of 2nd readers (restrictive vs non-restrictive strategy), these organised programmes had similar screening regimen/procedures and duration, which maximises comparability. Variation in performance and its determinants were explored in order to improve mammography practice and optimise screening performance. METHODS: Circa 200,000 screens performed between 1999 and 2006 (4 rounds) in 3 longest standing Swiss cantonal programmes (of Vaud, Geneva and Valais) were assessed. Indicators of quality and effectiveness were assessed according to European standards. Interval cancers were identified through linkage with cancer registries records. RESULTS: Swiss programmes met most European standards of performance with a substantial, favourable cancer stage shift. Up to a two-fold variation occurred for several performance indicators. In subsequent rounds, compared with programmes (Vaud and Geneva) that applied a restrictive selection strategy for 2nd readers, proportions of in situ lesions and of small cancers (≤1cm) were one third lower and halved, respectively, and the proportion of advanced lesions (stage II+) nearly 50% higher in the programme without a restrictive selection strategy. Discrepancy in second-year proportional incidence of interval cancers appears to be multicausal. CONCLUSION: Differences in performance could partly be explained by a selective strategy for second readers and a prior experience in service screening, but not by the levels of opportunistic screening and programme attendance. This study provides clues for enhancing mammography screening performance in low-volume programmes.

Relevância:

30.00%

Publicador:

Resumo:

Depuis le séminaire H. Cartan de 1954-55, il est bien connu que l'on peut trouver des éléments de torsion arbitrairement grande dans l'homologie entière des espaces d'Eilenberg-MacLane K(G,n) où G est un groupe abélien non trivial et n>1. L'objectif majeur de ce travail est d'étendre ce résultat à des H-espaces possédant plus d'un groupe d'homotopie non trivial. Dans le but de contrôler précisément le résultat de H. Cartan, on commence par étudier la dualité entre l'homologie et la cohomologie des espaces d'Eilenberg-MacLane 2-locaux de type fini. On parvient ainsi à raffiner quelques résultats qui découlent des calculs de H. Cartan. Le résultat principal de ce travail peut être formulé comme suit. Soit X un H-espace ne possédant que deux groupes d'homotopie non triviaux, tous deux finis et de 2-torsion. Alors X n'admet pas d'exposant pour son groupe gradué d'homologie entière réduite. On construit une large classe d'espaces pour laquelle ce résultat n'est qu'une conséquence d'une caractéristique topologique, à savoir l'existence d'un rétract faible X K(G,n) pour un certain groupe abélien G et n>1. On généralise également notre résultat principal à des espaces plus compliqués en utilisant la suite spectrale d'Eilenberg-Moore ainsi que des méthodes analytiques faisant apparaître les nombres de Betti et leur comportement asymptotique. Finalement, on conjecture que les espaces qui ne possèdent qu'un nombre fini de groupes d'homotopie non triviaux n'admettent pas d'exposant homologique. Ce travail contient par ailleurs la présentation de la « machine d'Eilenberg-MacLane », un programme C++ conçu pour calculer explicitement les groupes d'homologie entière des espaces d'Eilenberg-MacLane.

By the work of H. Cartan, it is well known that one can find elements of arbitrarily high torsion in the integral (co)homology groups of an Eilenberg-MacLane space K(G,n), where G is a non-trivial abelian group and n>1.
The main goal of this work is to extend this result to H-spaces having more than one non-trivial homotopy group. In order to have an accurate hold on H. Cartan's result, we start by studying the duality between homology and cohomology of 2-local Eilenberg-MacLane spaces of finite type. This leads us to some improvements of H. Cartan's methods in this particular case. Our main result can be stated as follows. Let X be an H-space with two non-vanishing finite 2-torsion homotopy groups. Then X does not admit any exponent for its reduced integral graded (co)homology group. We construct a wide class of examples for which this result is a simple consequence of a topological feature, namely the existence of a weak retract X K(G,n) for some abelian group G and n>1. We also generalize our main result to more complicated stable two-stage Postnikov systems, using the Eilenberg-Moore spectral sequence and analytic methods involving Betti numbers and their asymptotic behaviour. Finally, we investigate some conjectures on the non-existence of homology exponents for finite Postnikov towers. We conjecture that Postnikov pieces do not admit any (co)homology exponent. This work also includes the presentation of the "Eilenberg-MacLane machine", a C++ program designed to compute explicitly all integral homology groups of Eilenberg-MacLane spaces.

It is always difficult for a mathematician to talk about his work. The difficulty lies in the fact that the objects he studies are abstract: one rarely runs into a vector space, an abelian category or a Laplace transform on a street corner! Yet even if mathematical objects are hard to grasp for a non-mathematician, the methods used to study them are essentially the same as in the other scientific disciplines. Complex objects are broken down into simpler components. One draws up the list of the properties of mathematical objects, then classifies them into families of objects sharing a common feature. One looks for different but equivalent ways of formulating a problem. And so on.

My work belongs to the mathematical field of algebraic topology. The ultimate goal of this discipline is to classify all topological spaces by means of algebra. This activity is comparable to that of an ornithologist (the topologist) who studies birds (the topological spaces) with, say, a pair of binoculars (the algebra). If he sees a small, tree-dwelling, singing, nest-building bird whose feet have four toes, three pointing forward and one, armed with a strong claw, pointing backward, he will conclude with certainty that it is a passerine. It will still remain to determine whether it is a sparrow, a blackbird or a nightingale. Consider a few examples of topological spaces: a) a hollow cube, b) a sphere and c) a hollow torus (i.e. an inner tube). Where any ordinarily constituted person sees three different figures, the topologist sees only two! From his point of view, the cube and the sphere are not different, since they are homeomorphic: one can be transformed into the other continuously (it would suffice to blow into the cube to obtain the sphere). The sphere and the torus, on the other hand, are not homeomorphic: knead the sphere any way you like (without tearing it), you will never obtain the torus. There are infinitely many topological spaces and, contrary to what one might naively be tempted to believe, determining whether two of them are homeomorphic is in general very difficult.

To try to solve this problem, topologists had the idea of bringing algebra into their reasoning. This was the birth of homotopy theory. Following a very particular recipe, one associates with every topological space an infinite collection of what algebraists call groups. The groups obtained in this way are called the homotopy groups of the topological space. Mathematicians first showed that two topological spaces that are homeomorphic (for example the cube and the sphere) have the same homotopy groups. One then speaks of invariants (the homotopy groups are indeed invariant with respect to homeomorphic topological spaces). Consequently, two topological spaces that do not have the same homotopy groups can in no case be homeomorphic. This is an excellent way of classifying topological spaces (think of the ornithologist who examines birds' feet to determine whether he is dealing with a passerine or not).

My work concerns topological spaces that have only a finite number of non-trivial homotopy groups. Such spaces are called finite Postnikov towers. We study their integral cohomology groups, another family of invariants, in the same way as the homotopy groups. The size of a cohomology group is measured, in a certain sense, by the notion of exponent; thus a cohomology group admitting an exponent is relatively small. One of the main results of this work concerns the size of the cohomology groups of finite Postnikov towers. It is the following theorem: a 1-connected, 2-local H-space of finite type that has only one or two non-trivial homotopy groups admits no exponent for its reduced integral graded cohomology group. Interpreted qualitatively, this result says that the smaller a space is from the point of view of cohomology (i.e. if it has a cohomology exponent), the more interesting it is from the point of view of homotopy (i.e. it will have more than two non-trivial homotopy groups). It follows from my work that such spaces are very interesting, in the sense that they can have infinitely many non-trivial homotopy groups. Jean-Pierre Serre, Fields medallist in 1954, showed that every sphere of dimension > 1 has infinitely many non-trivial homotopy groups. From spaces with a cohomology exponent to spheres, there is but one step...
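The dichotomy at the heart of the abstract can be anchored by one classical computation, the single Eilenberg-MacLane space whose integral homology is completely elementary (a standard fact recalled for orientation, not a result of the thesis):

```latex
% K(Z/2,1) = RP^infinity has homology exponent 2:
\tilde H_n\bigl(K(\mathbb{Z}/2,1);\mathbb{Z}\bigr)
  \;=\; \tilde H_n(\mathbb{R}P^{\infty};\mathbb{Z})
  \;=\;
  \begin{cases}
    \mathbb{Z}/2, & n \text{ odd},\\
    0,            & n \text{ even},\; n > 0,
  \end{cases}
\qquad\text{so}\qquad
2\cdot\tilde H_*\bigl(K(\mathbb{Z}/2,1);\mathbb{Z}\bigr)=0 .
```

For n > 1, by contrast, Cartan's computations produce cyclic summands of arbitrarily high 2-power order in the reduced integral homology of K(Z/2,n), so no exponent can exist; the thesis extends this failure of exponents to H-spaces with two such homotopy groups.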

Relevância:

30.00%

Publicador:

Resumo:

It is commonly observed that complex fabricated structures subject to fatigue loading fail at the welded joints. Some problems can be corrected by proper detail design, but fatigue performance can also be improved using post-weld improvement methods. In general, improvement methods can be divided into two main groups: weld geometry modification methods and residual stress modification methods. The former remove weld toe defects and/or reduce the stress concentration, while the latter introduce compressive stress fields in the area where fatigue cracks are likely to initiate. Ultrasonic impact treatment (UIT) is a novel post-weld treatment method that both modifies the residual stress distribution and improves the local geometry of the weld. The structural fatigue strength of non-load-carrying attachments in the as-welded condition has been experimentally compared to the structural fatigue strength of ultrasonic impact treated welds. Longitudinal attachment specimens made of two thicknesses of steel S355 J0 were tested to determine the efficiency of ultrasonic impact treatment. Treated welds were found to have about 50% greater structural fatigue strength when the slope of the S-N curve is three. High mean stress fatigue testing based on the Ohta method decreased the degree of weld improvement by only 19%. This indicates that the method could also be applied to large fabricated structures operating under high reactive residual stresses equilibrated within the volume of the structure. The thickness of the specimens had no significant effect on the structural fatigue strength: the fatigue class difference between the 5 mm and 8 mm specimens was only 8%. It was hypothesized that the UIT method adds a significant crack initiation period to the total fatigue life of the welded joints. Crack initiation life was estimated by a local strain approach. Material parameters were defined using a modified Uniform Material Law developed in Germany. Finite element analysis and X-ray diffraction were used to define, respectively, the stress concentration and the mean stress. The theoretical fatigue life showed good accuracy compared with experimental fatigue tests. The predictive behaviour of the local strain approach combined with the Uniform Material Law was excellent for the joint types and conditions studied in this work.
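The crack-initiation estimate can be sketched with the strain-life (Coffin-Manson-Basquin) relation that the local strain approach rests on. The parameter values below follow the commonly quoted Uniform Material Law estimates for steels (sf = 1.5·Rm, b = -0.087, ef = 0.59, c = -0.58) with an assumed ultimate strength; they are illustrative, not the modified values used in the thesis:

```python
import math

# Strain-life relation:  eps_a = (sf / E) * (2N)**b + ef * (2N)**c
# with N the cycles to crack initiation and eps_a the local strain amplitude.
E = 206_000.0          # Young's modulus, MPa
Rm = 520.0             # ultimate strength of an S355-like steel, MPa (assumed)
sf, b = 1.5 * Rm, -0.087   # fatigue strength coefficient / exponent (UML estimate)
ef, c = 0.59, -0.58        # fatigue ductility coefficient / exponent (UML estimate)

def strain_amplitude(N):
    return (sf / E) * (2 * N) ** b + ef * (2 * N) ** c

def cycles_to_initiation(eps_a, lo=1.0, hi=1e9):
    # strain_amplitude decreases monotonically with N: bisect on log N.
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if strain_amplitude(mid) > eps_a:
            lo = mid
        else:
            hi = mid
    return mid

N = cycles_to_initiation(0.002)   # 0.2% local strain amplitude (assumed)
print(round(N))
```

In the thesis's framework, N from such a relation (with FE-derived local strain and X-ray-derived mean stress corrections) is the initiation life that UIT lengthens.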

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Canopy characterization is a key factor in improving pesticide application methods in tree crops and vineyards. The development of quick, easy and efficient methods to determine the fundamental parameters used to characterize canopy structure is thus an important need. In this research, the use of ultrasonic and LIDAR sensors has been compared with the traditional manual and destructive canopy measurement procedure. For both methods, the values of key parameters such as crop height, crop width, crop volume and leaf area have been compared. The results indicate that an ultrasonic sensor is an appropriate tool to determine average canopy characteristics, while a LIDAR sensor provides more accurate and detailed information about the canopy. Good correlations have been obtained between crop volume (CVU) values measured with ultrasonic sensors and leaf area index, LAI (R2 = 0.51). A good correlation has also been obtained between the canopy volumes measured with ultrasonic and LIDAR sensors (R2 = 0.52). Laser measurements of crop height (CHL) allow one to accurately predict the canopy volume. The proposed new technologies seem very appropriate as complementary tools to improve the efficiency of pesticide applications, although further improvements are still needed.
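The R² values reported above are coefficients of determination from simple linear fits between sensor-derived and reference canopy descriptors. A minimal sketch of that computation follows; the data arrays are hypothetical, not the study's measurements.

```python
import numpy as np

def r_squared(x, y):
    """R^2 of a least-squares linear fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    y_hat = slope * x + intercept
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical ultrasonic crop volumes (m^3) and leaf area index values,
# for illustration only.
cvu = np.array([1.2, 1.8, 2.5, 3.1, 3.9, 4.4])
lai = np.array([0.9, 1.1, 1.6, 1.7, 2.3, 2.4])
r2 = r_squared(cvu, lai)
print(round(r2, 2))
```

An R² of 0.51, as reported for CVU vs. LAI, means the linear fit explains about half of the variance, which is why the authors position the sensors as complementary rather than replacement tools.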

Relevância:

30.00% 30.00%

Publicador:

Resumo:

BACKGROUND: Left atrial (LA) dilatation is associated with a large variety of cardiac diseases. Current cardiovascular magnetic resonance (CMR) strategies to measure LA volumes are based on multi-breath-hold multi-slice acquisitions, which are time-consuming and susceptible to misregistration. AIM: To develop a time-efficient single breath-hold 3D CMR acquisition and reconstruction method to precisely measure LA volumes and function. METHODS: A highly accelerated compressed-sensing multi-slice cine sequence (CS-cineCMR) was combined with a non-model-based 3D reconstruction method to measure LA volumes with high temporal and spatial resolution during a single breath-hold. This approach was validated in LA phantoms of different shapes and applied in 3 patients. In addition, the influence of slice orientation on accuracy was evaluated in the LA phantoms for the new approach in comparison with a conventional model-based biplane area-length reconstruction. As a reference in patients, a self-navigated high-resolution whole-heart 3D dataset (3D-HR-CMR) was acquired during mid-diastole to yield accurate LA volumes. RESULTS: Phantom studies: LA volumes were accurately measured by CS-cineCMR with a mean difference of -4.73 ± 1.75 ml (-8.67 ± 3.54%, r2 = 0.94). For the new method the calculated volumes were not significantly different when different orientations of the CS-cineCMR slices were applied to cover the LA phantoms. Long-axis "aligned" vs "not aligned" with the phantom long-axis yielded similar differences vs the reference volume (-4.87 ± 1.73 ml vs. -4.45 ± 1.97 ml, p = 0.67), as did short-axis "perpendicular" vs. "not-perpendicular" with the LA long-axis (-4.72 ± 1.66 ml vs. -4.75 ± 2.13 ml; p = 0.98). The conventional biplane area-length method was susceptible to slice orientation (p = 0.0085 for the interaction of "slice orientation" and "reconstruction technique", 2-way ANOVA for repeated measures).
To use the 3D-HR-CMR as the reference for LA volumes in patients, it was validated in the LA phantoms (mean difference: -1.37 ± 1.35 ml, -2.38 ± 2.44%, r2 = 0.97). Patient study: The CS-cineCMR LA volumes of the mid-diastolic frame matched closely with the reference LA volume (measured by 3D-HR-CMR) with a difference of -2.66 ± 6.5 ml (3.0% underestimation; true LA volumes: 63 ml, 62 ml, and 395 ml). Finally, a high intra- and inter-observer agreement for maximal and minimal LA volume measurement was also shown. CONCLUSIONS: The proposed method combines a highly accelerated single breath-hold compressed-sensing multi-slice CMR technique with a non-model-based 3D reconstruction to accurately and reproducibly measure LA volumes and function.
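The conventional biplane area-length reconstruction that the study uses as a comparator assumes an ellipsoidal chamber, which is why it is sensitive to slice orientation. A minimal sketch of that standard formula follows; the area and length values are illustrative, not patient data.

```python
import math

def biplane_area_length_volume(a1_cm2, a2_cm2, length_cm):
    """Biplane area-length chamber volume (ml) from two orthogonal
    long-axis areas A1, A2 and the long-axis length L:
        V = (8 / (3*pi)) * A1 * A2 / L
    Assumes an ellipsoidal chamber shape."""
    return (8.0 / (3.0 * math.pi)) * a1_cm2 * a2_cm2 / length_cm

# Illustrative example: 20 cm^2 and 22 cm^2 long-axis areas, 5.5 cm length.
v = biplane_area_length_volume(20.0, 22.0, 5.5)
print(round(v, 1))  # 67.9 (ml)
```

Because A1, A2 and L all change when the imaging planes are tilted relative to the chamber's true axes, the model-based estimate drifts with slice orientation, whereas the non-model-based 3D reconstruction proposed in the abstract does not rely on this shape assumption.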

Relevância:

30.00% 30.00%

Publicador:

Resumo:

This thesis deals with measuring the volume of wood logs by means of color machine vision. The color images were obtained from the grinding mill of a forest industry company located in Simpele. The thesis presents in detail the mathematical theory underlying the image processing methods used, such as classification, noise removal and log segmentation. The presented methods were implemented in practice, and the results obtained with the different methods were compared with each other. The image processing algorithms were implemented with Matlab 6.0, mainly using the latest Image Processing Toolbox, version 3.0. The perspective of this thesis is primarily that of practical application, since the forest industry operates at a high level in Finland and there are many companies in the field that could make use of the method developed in this work.
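The segmentation step mentioned above separates log pixels from the background before any volume estimate can be made. The following is a minimal, hypothetical sketch of such a thresholding step on a synthetic image; the thesis itself uses Matlab and color-based classification with noise removal, which this toy example does not reproduce.

```python
import numpy as np

def segment(image, threshold):
    """Binary mask of pixels brighter than the threshold."""
    return image > threshold

# Synthetic 6x6 grayscale "image": a bright log region (0.8)
# on a dark background (0.1). Real inputs would be color images.
img = np.full((6, 6), 0.1)
img[2:4, 1:5] = 0.8          # one horizontal log, 2 x 4 pixels
mask = segment(img, 0.5)
print(int(mask.sum()))       # 8 "log" pixels
```

Counting mask pixels per cross-section is the starting point for estimating log volume once the pixel-to-metric scale of the camera is known.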