927 results for graphics processor
Abstract:
Sophisticated magnetic resonance tagging techniques provide powerful tools for the non-invasive assessment of local heart-wall motion, towards a deeper fundamental understanding of local heart function. A new image analysis procedure has been developed for the extraction of motion data from time series of magnetic resonance tagged images and for the visualization of local heart-wall motion. New parameters have been derived which allow quantification of the motion patterns and are highly sensitive to any changes in these patterns. The new procedure has been applied to heart motion analysis in healthy volunteers and in patient collectives with different heart diseases. The results achieved are summarized and discussed.
Abstract:
Introduction. Development of the fetal brain surface with concomitant gyrification is one of the major maturational processes of the human brain. First delineated by postmortem studies or by ultrasound, MRI has recently become a powerful tool for studying in vivo the structural correlates of brain maturation. However, the quantitative measurement of fetal brain development is a major challenge because of the movement of the fetus inside the amniotic cavity, the poor spatial resolution, the partial volume effect and the changing appearance of the developing brain. Today extensive efforts are made to deal with the “post-acquisition” reconstruction of high-resolution 3D fetal volumes based on several acquisitions with lower resolution (Rousseau, F., 2006; Jiang, S., 2007). We here propose a framework devoted to the segmentation of the basal ganglia, the gray-white tissue segmentation, and in turn the 3D cortical reconstruction of the fetal brain. Method. Prenatal MR imaging was performed with a 1-T system (GE Medical Systems, Milwaukee) using single shot fast spin echo (ssFSE) sequences in fetuses aged from 29 to 32 gestational weeks (slice thickness 5.4 mm, in-plane spatial resolution 1.09 mm). For each fetus, 6 axial volumes shifted by 1 mm were acquired (about 1 min per volume). First, each volume is manually segmented to extract the fetal brain from surrounding fetal and maternal tissues. Intensity inhomogeneity correction and linear intensity normalization are then performed. A high spatial resolution image with an isotropic voxel size of 1.09 mm is created for each fetus as previously published by others (Rousseau, F., 2006). B-splines are used for the scattered data interpolation (Lee, 1997). Then, basal ganglia segmentation is performed on this super-reconstructed volume using an active contour framework with a Level Set implementation (Bach Cuadra, M., 2010). Once the basal ganglia are removed from the image, brain tissue segmentation is performed (Bach Cuadra, M., 2009). The resulting white matter image is then binarized and further given as an input to the Freesurfer software (http://surfer.nmr.mgh.harvard.edu/) to provide accurate three-dimensional reconstructions of the fetal brain. Results. High-resolution images of the fetal brain, as obtained from the low-resolution acquired MRI, are presented for 4 subjects of age ranging from 29 to 32 GA. An example is depicted in Figure 1. Accuracy of the automated basal ganglia segmentation is compared with manual segmentation using the Dice similarity index (DSI), with values above 0.7 considered to be very good agreement. In our sample we observed DSI values between 0.785 and 0.856. We further show the results of gray-white matter segmentation overlaid on the high-resolution gray-scale images. The results are visually checked for accuracy using the same principles as commonly accepted in adult neuroimaging. Preliminary 3D cortical reconstructions of the fetal brain are shown in Figure 2. Conclusion. We hereby present a complete pipeline for the automated extraction of an accurate three-dimensional cortical surface of the fetal brain. These results are preliminary but promising, with the ultimate goal of providing a “movie” of normal gyral development.
In turn, a precise knowledge of normal fetal brain development will allow the quantification of subtle and early but clinically relevant deviations. Moreover, a precise understanding of the gyral development process may help to build hypotheses to understand the pathogenesis of several neurodevelopmental conditions in which gyrification has been shown to be altered (e.g. schizophrenia, autism…). References. Rousseau, F. (2006), 'Registration-Based Approach for Reconstruction of High-Resolution In Utero Fetal MR Brain Images', IEEE Transactions on Medical Imaging, vol. 13, no. 9, pp. 1072-1081. Jiang, S. (2007), 'MRI of Moving Subjects Using Multislice Snapshot Images With Volume Reconstruction (SVR): Application to Fetal, Neonatal, and Adult Brain Studies', IEEE Transactions on Medical Imaging, vol. 26, no. 7, pp. 967-980. Lee, S. (1997), 'Scattered data interpolation with multilevel B-splines', IEEE Transactions on Visualization and Computer Graphics, vol. 3, no. 3, pp. 228-244. Bach Cuadra, M. (2010), 'Central and Cortical Gray Matter Segmentation of Magnetic Resonance Images of the Fetal Brain', ISMRM Conference. Bach Cuadra, M. (2009), 'Brain tissue segmentation of fetal MR images', MICCAI.
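For reference, the Dice similarity index used above to compare automated and manual basal ganglia segmentations can be computed directly from two binary masks. The following is a minimal sketch, assuming the segmentations are available as NumPy boolean volumes; the array shapes and values in the example are purely illustrative, not data from the study.

```python
import numpy as np

def dice_similarity(seg_a: np.ndarray, seg_b: np.ndarray) -> float:
    """Dice similarity index between two binary segmentations.

    DSI = 2 * |A intersect B| / (|A| + |B|); 1.0 means perfect overlap.
    """
    a = seg_a.astype(bool)
    b = seg_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Example: compare a hypothetical automated basal-ganglia mask against a
# manual one (in practice these would be extracted from the MR volumes).
auto_mask = np.zeros((64, 64, 64), dtype=bool)
manual_mask = np.zeros((64, 64, 64), dtype=bool)
auto_mask[20:40, 20:40, 20:40] = True
manual_mask[22:42, 20:40, 20:40] = True
print(f"DSI = {dice_similarity(auto_mask, manual_mask):.3f}")
```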
Abstract:
BACKGROUND/AIM: Raloxifene is the first selective estrogen receptor modulator that has been approved for the treatment and prevention of osteoporosis in postmenopausal women in Europe and in the US. Although raloxifene reduces the risk of invasive breast cancer in postmenopausal women with osteoporosis and in postmenopausal women at high risk for invasive breast cancer, it is approved for that indication in the US but not in the EU. The aim was to characterize the clinical profiles of postmenopausal women expected to benefit most from therapy with raloxifene based on published scientific evidence to date. METHODS: Key individual patient characteristics relevant to the prescription of raloxifene in daily practice were defined by a board of Swiss experts in the fields of menopause and metabolic bone diseases and linked to published scientific evidence. Consensus was reached about translating these insights into daily practice. RESULTS: Through estrogen agonistic effects on bone, raloxifene reduces biochemical markers of bone turnover to premenopausal levels, increases bone mineral density (BMD) at the lumbar spine, proximal femur, and total body, and reduces vertebral fracture risk in women with osteopenia or osteoporosis with and without prevalent vertebral fracture. Through estrogen antagonistic effects on breast tissue, raloxifene reduces the risk of invasive estrogen-receptor positive breast cancer in postmenopausal women with osteoporosis and in postmenopausal women at high risk for invasive breast cancer. Finally, raloxifene increases the incidence of hot flushes, the risk of venous thromboembolic events, and the risk of fatal stroke in postmenopausal women at increased risk for coronary heart disease. Postmenopausal women in whom the use of raloxifene is considered can be categorized into a 2 × 2 matrix reflecting their bone status (osteopenic or osteoporotic based on their BMD T-score by dual energy X-ray absorptiometry) and their breast cancer risk (low or high based on the modified Gail model). Women at high risk of breast cancer should be considered for treatment with raloxifene. CONCLUSION: Postmenopausal women between 50 and 70 years of age without climacteric symptoms with either osteopenia or osteoporosis should be evaluated with regard to their breast cancer risk and considered for treatment with raloxifene within the framework of its contraindications and precautions.
Abstract:
Positive pressure ventilation (PPV) is a frequent intervention in the neonatal intensive care unit. This article is directed towards paediatricians in training and attempts to cover the basics of PPV without being too technical. To do so, we have made extensive use of graphics to illustrate the underlying principles.
Abstract:
Centrifuge is a user-friendly system to simultaneously access Arabidopsis gene annotations and intra- and inter-organism sequence comparison data. The tool allows rapid retrieval of user-selected data for each annotated Arabidopsis gene providing, in any combination, data on the following features: predicted protein properties such as mass, pI, cellular location and transmembrane domains; SWISS-PROT annotations; Interpro domains; Gene Ontology records; verified transcription; BLAST matches to the proteomes of A.thaliana, Oryza sativa (rice), Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens. The tool lends itself particularly well to the rapid analysis of contigs or of tens or hundreds of genes identified by high-throughput gene expression experiments. In these cases, a summary table of principal predicted protein features for all genes is given followed by more detailed reports for each individual gene. Centrifuge can also be used for single gene analysis or in a word search mode. AVAILABILITY: http://centrifuge.unil.ch/ CONTACT: edward.farmer@unil.ch.
Abstract:
Increasing our knowledge of the properties of networks of cities is essential; these properties can be measured at the city level but must also be assessed by analyzing actor networks. The present volume focuses less on individual characteristics and more on the interactions of actors and institutions that create functional territories in which the structure of existing links constrains emerging links. Rather than basing explanations on external factors, the goal is to determine the extent to which network properties reflect spatial distributions and create local synergies at the meso level that are incorporated into global networks at the macro level, thus bringing different geographical scales into play. The paper introduces a way of using graph structure to identify empirically relevant groups and levels that explain these dynamics. It defines what could be called “multi-level”, “multi-scale”, or “multidimensional” networks in the context of urban geography. It explains how the network multi-territoriality paradigm, collaboratively formulated and manipulated by geographers and computer scientists, converged to produce the SPANGEO project, which is presented in this volume.
Abstract:
The theory of small-world networks, as initiated by Watts and Strogatz (1998), has brought new insights to spatial analysis as well as systems theory. The theory's concepts and methods are particularly relevant to geography, where spatial interaction is mainstream and where interactions can be described and studied using large numbers of exchanges or similarity matrices. Networks are organized through direct links or by indirect paths, inducing topological proximities that simultaneously involve spatial, social, cultural or organizational dimensions. Network synergies build over similarities and are fed by complementarities between or inside cities, with the two effects potentially amplifying each other according to the “preferential attachment” hypothesis that has been explored in a number of different scientific fields (Barabási, Albert 1999; Barabási A-L 2002; Newman M, Watts D, Barabási A-L). In fact, according to Barabási and Albert (1999), the high level of hierarchy observed in “scale-free networks” results from “preferential attachment”, which characterizes the development of networks: new connections appear preferentially close to nodes that already have the largest number of connections because, in this way, the improvement in the network accessibility of the new connection will likely be greater. However, at the same time, network regions gathering dense and numerous weak links (Granovetter, 1985) or network entities acting as bridges between several components (Burt 2005) offer a higher capacity for urban communities to benefit from opportunities and create future synergies. Several methodologies have been suggested to identify such denser and more coherent regions (also called communities or clusters) in terms of links (Watts, Strogatz 1998; Watts 1999; Barabási, Albert 1999; Barabási 2002; Auber 2003; Newman 2006). These communities not only possess a high level of dependency among their member entities but also show a low level of “vulnerability”, allowing for numerous redundancies (Burt 2000; Burt 2005). The SPANGEO project 2005-2008 (SPAtial Networks in GEOgraphy), gathering a team of geographers and computer scientists, included empirical studies to survey concepts and measures developed in other related fields, such as physics, sociology and communication science. The relevance and potential interpretation of weighted or non-weighted measures on edges and nodes were examined and analyzed at different scales (intra-urban, inter-urban or both). New classification and clustering schemes based on the relative local density of subgraphs were developed. The present article describes how these notions and methods contribute on a conceptual level, in terms of measures, delineations, explanatory analyses and visualization of geographical phenomena.
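As an illustration of the preferential-attachment mechanism cited above (Barabási, Albert 1999), a network can be grown so that new nodes connect preferentially to nodes that already have many connections. The sketch below is a minimal, self-contained implementation of that growth rule; it is not code from the SPANGEO project, and the node counts and parameters are arbitrary.

```python
import random
from collections import defaultdict

def barabasi_albert(n, m, seed=None):
    """Grow an undirected network of n nodes by preferential attachment.

    Each new node attaches to m existing nodes chosen with probability
    proportional to their current degree (the Barabási-Albert mechanism).
    """
    rng = random.Random(seed)
    adjacency = defaultdict(set)
    endpoints = []  # each node appears once per incident edge: sampling
                    # from this list is degree-proportional sampling

    # Start from a small fully connected core of m + 1 nodes.
    for u in range(m + 1):
        for v in range(u + 1, m + 1):
            adjacency[u].add(v)
            adjacency[v].add(u)
            endpoints += [u, v]

    for new_node in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(endpoints))  # degree-biased choice
        for t in targets:
            adjacency[new_node].add(t)
            adjacency[t].add(new_node)
            endpoints += [new_node, t]
    return adjacency

net = barabasi_albert(n=1000, m=2, seed=42)
degrees = sorted((len(nb) for nb in net.values()), reverse=True)
print("five largest degrees:", degrees[:5])  # a few highly connected hubs emerge
```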
Abstract:
SoftPlotter, a soft photogrammetric software package running on a Silicon Graphics workstation, was used to evaluate the accuracy of soft photogrammetry and identify applications of this technology to highway engineering. A comparative study showed that SoftPlotter compares well with other software such as Socket and Intergraph. The PC software TNTMips is inexpensive but needs further development to be comparable to SoftPlotter. The Campus Project showed that soft photogrammetry is accurate for traditional photogrammetric applications. It is also accurate for producing orthophotos and base maps for Geographic Information Systems (GISs). The Highway Project showed that soft photogrammetry is accurate for highway engineering and that the technical staff at the Iowa Department of Transportation (IA DOT) can be easily trained in this new technology. The research demonstrated that soft photogrammetry can be used with low-flight helicopter photography for large-scale mapping in highway engineering. The researchers recommend that research be conducted to test the use of digital cameras instead of traditional aerial cameras in helicopter photography. Research that examines the use of soft photogrammetry with video logging imagery for inventory and GIS studies in highway maintenance is also recommended. Research is also warranted into the integration of soft photogrammetry with virtual reality, which can be used for three-dimensional design and visualization of highways and subdivisions in real time. The IA DOT owns one analytical plotter and two analogue plotters. The analytical plotter is used for aerial triangulation, and the analogue plotters are used for plotting. However, neither is capable of producing orthophotos. Therefore, the researchers recommend that the IA DOT purchase soft photogrammetric workstations for orthophoto production and, if and when required, use them for aerial triangulation and plotting. In the future, the analogue plotters may become obsolete. At that time, the researchers recommend that the analogue plotters be phased out and replaced by soft photogrammetric workstations.
Abstract:
[spa] This is a study of workplace absenteeism in Spain during the first decade of the 21st century, with comparisons to other nearby countries. Graphs of the most significant contingencies contributing to absences from work are analysed and presented. The organizational, social and political factors influencing this phenomenon, whose costs are high, are discussed. The study argues that lack of motivation, in its multiple aspects, is the key reason why absenteeism levels are high. The work describes the various means commonly used to reduce absences from work.
Abstract:
The Zeman Barn (86-00028) is an early twentieth-century example of a gothic roofed barn and is part of the Zeman Farmstead located along U.S. Highway 30 in Otter Creek Township (Township 38N, Range 14W), Tama County, Iowa (Figures 1 and 2). The farmstead was initially evaluated in a reconnaissance architectural survey conducted in 1998 by The Louis Berger Group, Inc. (Berger). An intensive architectural survey of the property by Berger’s Principal Architectural Historian, Martha H. Bowers, evaluated the farmstead as not being eligible for listing in the National Register of Historic Places (National Register) but noted that the barn appears to be eligible for listing in the National Register under Criterion C (Bowers 1998). At the request of the Iowa Department of Transportation, Berger completed the recordation project to provide a documentary record of the Zeman Barn in accordance with the guidelines set forth by the Iowa State Historic Preservation Office regarding historic property studies for barns. Background research for this project was conducted in September 2008 and April 2009. The property was inspected and photographed in May 2008. Information on the property was gathered through background research, interviews with Zeman family members, field investigation, and photo documentation. Historical maps of the project area were used to collect data necessary for developing regional and local historic contexts. The research for this report was conducted at the Tama County Courthouse and the Tama County Historical Museum Genealogical Library, both in Toledo. Much of the background research for the project was conducted by Camilla Deiber and Michael Dulle. Ms. Deiber also prepared the photographic documentation, plan drawings, and the graphics used in this report. Mr. Roger L. Ciuffo conducted interviews with Zeman family members and wrote this report.
Abstract:
The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely the industrial automation used in PLC, DCS, SCADA and robot control systems. Today this industry employs over 200,000 people in a profitable slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers and is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open source technologies, both in software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated architectures consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and to customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of Machine-to-Machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets and will eventually also impact industrial automation through game-changing commoditization and related control point and business model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies require memory proportional to the squared number of SNPs, so a genome-wide epistasis search would require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) 2352 processors running at 2.1 GHz. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn’s disease (CD) data. Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and that could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
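To illustrate the memory argument, a max-statistic permutation correction can be organized so that only the largest statistic of each permuted dataset is kept, making memory independent of the number of effects screened. The sketch below uses a simple correlation statistic on simulated data and a single-step adjustment; it illustrates the general idea only, and is not the MB-MDR test statistic or the step-down algorithm implemented in MBMDR-3.0.3.

```python
import numpy as np

def maxt_adjusted_pvalues(X, y, n_perm=999, seed=None):
    """Single-step max-statistic permutation adjustment (maxT-style).

    X : (n_individuals, n_tests) matrix of coded genetic effects
        (e.g. one column per SNP or SNP pair); y : phenotype vector.
    The statistic here is the absolute correlation of each column with y.
    Only the maximum statistic of each permuted dataset is stored, so
    memory stays O(n_tests + n_perm) rather than O(n_tests * n_perm).
    """
    rng = np.random.default_rng(seed)
    xc = X - X.mean(axis=0)
    x_norm = np.sqrt((xc ** 2).sum(axis=0))

    def stats(phenotype):
        yc = phenotype - phenotype.mean()
        return np.abs(xc.T @ yc) / (x_norm * np.sqrt((yc ** 2).sum()))

    observed = stats(y)
    max_per_perm = np.empty(n_perm)
    for b in range(n_perm):
        max_per_perm[b] = stats(rng.permutation(y)).max()

    # Adjusted p-value: share of permutation maxima >= observed statistic.
    max_sorted = np.sort(max_per_perm)
    n_ge = n_perm - np.searchsorted(max_sorted, observed, side="left")
    return observed, (1.0 + n_ge) / (n_perm + 1.0)

# Toy usage on simulated genotype-like data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 500)).astype(float)
y = 0.8 * X[:, 7] + rng.normal(size=200)      # column 7 is truly associated
obs, p_adj = maxt_adjusted_pvalues(X, y, n_perm=199, seed=1)
print("most significant test:", int(p_adj.argmin()), "adjusted p =", p_adj.min())
```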
Abstract:
Exposure to solar ultraviolet (UV) light is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors. Individual exposure data remain scarce and development of alternative assessment methods is greatly needed. We developed a model simulating human exposure to solar UV. The model predicts the dose and distribution of UV exposure received on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop a rendering engine that estimates the solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by each triangle was calculated, taking into account reflected, direct and diffuse radiation, and shading from other body parts. Dosimetric measurements (n = 54) were conducted in field conditions using a foam manikin as a surrogate for an exposed individual. Dosimetric results were compared to the model predictions. The model predicted exposure to solar UV adequately. The symmetric mean absolute percentage error was 13%. Half of the predictions were within 17% of the measurements. This model provides a tool to assess outdoor occupational and recreational UV exposures, without necessitating time-consuming individual dosimetry, with numerous potential uses in skin cancer prevention and research.
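The per-triangle bookkeeping described here (a direct-beam term scaled by incidence angle and shading, plus diffuse and ground-reflected terms) can be sketched as follows. This is a simplified illustration with made-up parameter names and values, assuming an isotropic sky and precomputed visibility factors; it is not the authors' rendering engine.

```python
import numpy as np

def triangle_uv_power(normals, areas, sun_visible, sky_view, ground_view,
                      sun_dir, e_beam, e_diffuse_h, e_global_h, albedo):
    """Instantaneous UV power (W) intercepted by each triangle of a body mesh.

    normals     : (n, 3) unit outward normals of the triangles
    areas       : (n,) triangle areas in m^2
    sun_visible : (n,) 1.0 if the triangle sees the sun, 0.0 if shaded
                  (in a full renderer this comes from ray casting against the mesh)
    sky_view    : (n,) fraction of the sky dome visible from each triangle
    ground_view : (n,) fraction of the ground visible from each triangle
    sun_dir     : (3,) unit vector pointing towards the sun
    e_beam      : direct-beam erythemal irradiance at normal incidence (W/m^2)
    e_diffuse_h : diffuse erythemal irradiance on a horizontal plane (W/m^2)
    e_global_h  : global erythemal irradiance on a horizontal plane (W/m^2)
    albedo      : ground UV reflectance (a few percent for grass)
    Integrating this power over the exposure time gives the dose.
    """
    cos_incidence = np.clip(normals @ sun_dir, 0.0, None)   # back faces get 0
    direct = e_beam * cos_incidence * sun_visible           # beam component
    diffuse = e_diffuse_h * sky_view                         # isotropic-sky assumption
    reflected = albedo * e_global_h * ground_view            # ground-reflected component
    return (direct + diffuse + reflected) * areas

# Illustrative call: a single upward-facing 1 cm^2 facet under a high sun,
# with hypothetical irradiance values of summer-midday magnitude.
power = triangle_uv_power(
    normals=np.array([[0.0, 0.0, 1.0]]), areas=np.array([1e-4]),
    sun_visible=np.array([1.0]), sky_view=np.array([1.0]),
    ground_view=np.array([0.0]), sun_dir=np.array([0.0, 0.0, 1.0]),
    e_beam=0.15, e_diffuse_h=0.10, e_global_h=0.25, albedo=0.03,
)
print(f"intercepted UV power: {power[0]:.2e} W")
```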
Abstract:
Exposure to solar ultraviolet (UV) radiation is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors, but individual exposure data remain scarce. UV irradiance is monitored via different techniques including ground measurements and satellite observations. However, it is difficult to translate such observations into human UV exposure or dose because of confounding factors (shape of the exposed surface, shading, behavior, etc.). A collaboration between public health institutions, a meteorological office and an institute specialized in computing techniques developed a model predicting the dose and distribution of UV exposure on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop this tool, which estimates the solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by various body locations is computed for direct, diffuse and reflected radiation separately. The radiation components are deduced from corresponding measurements of UV irradiance, and the related UV dose received by each triangle of the virtual manikin is computed accounting for shading by other body parts and any protection measures. The model was verified with dosimetric measurements (n=54) in field conditions using a foam manikin as a surrogate for an exposed individual. Dosimetric results were compared to the model predictions. The model predicted exposure to solar UV adequately. The symmetric mean absolute percentage error was 13%. Half of the predictions were within 17% of the measurements. This model allows outdoor occupational and recreational UV exposures to be assessed, without necessitating time-consuming individual dosimetry, with numerous potential uses in skin cancer prevention and research. Using this tool, we investigated solar UV exposure patterns with respect to the relative contribution of the direct, diffuse and reflected radiation. We assessed exposure doses for various body parts and exposure scenarios of a standing individual (static and dynamic postures). As input, the model used erythemally weighted ground irradiance data measured in 2009 at Payerne, Switzerland. A year-round daily exposure (8 am to 5 pm) without protection was assumed. For most anatomical sites, mean daily doses were high (typically 6.2-14.6 SED) and exceeded recommended exposure values. Direct exposure was important during specific periods (e.g. midday during summer), but contributed moderately to the annual dose, ranging from 15 to 24% for vertical and horizontal body parts, respectively. Diffuse irradiation explained about 80% of the cumulative annual exposure dose. Acute diffuse exposures were also obtained for cloudy summer days. The importance of diffuse UV radiation should not be underestimated when advocating preventive measures. Messages focused on avoiding acute direct exposures may be of limited efficiency in preventing skin cancers associated with chronic exposure (e.g., squamous cell carcinomas).
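Both validations report the symmetric mean absolute percentage error between dosimeter measurements and model predictions. A short sketch of that metric using one common definition follows; the exact variant used by the authors is not stated in the abstract, and the numbers in the example are hypothetical.

```python
import numpy as np

def smape(measured, predicted):
    """Symmetric mean absolute percentage error, in percent.

    One common definition: mean of |p - m| / ((|m| + |p|) / 2).
    """
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    denom = (np.abs(measured) + np.abs(predicted)) / 2.0
    return float(np.mean(np.abs(predicted - measured) / denom) * 100.0)

# Illustrative comparison of hypothetical daily doses (SED) at a few body sites.
measured = np.array([6.2, 9.8, 14.6, 11.3])
predicted = np.array([5.7, 10.5, 13.9, 12.8])
print(f"sMAPE = {smape(measured, predicted):.1f}%")
```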
Abstract:
This paper presents SiMR, a simulator of the Rudimentary Machine designed to be used in a first course on computer architecture in Software Engineering and Computer Engineering programmes. The Rudimentary Machine contains all the basic elements of a RISC computer, and SiMR allows editing, assembling and executing programmes for this processor. SiMR is used at the Universitat Oberta de Catalunya as one of the most important resources in the Virtual Computing Architecture and Organisation Laboratory, since students work at home with the simulator and reports containing their work are automatically generated for evaluation by lecturers. The results obtained from a survey show that most students consider SiMR a highly necessary or even indispensable resource for learning the basic concepts of computer architecture.
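The abstract does not describe the Rudimentary Machine's instruction set, so the following is a hypothetical, much-simplified register machine illustrating the fetch-decode-execute cycle that such a teaching simulator exposes to students; SiMR's real ISA, assembler and report-generation features differ.

```python
# A toy register-machine simulator illustrating the fetch-decode-execute
# cycle. The instruction set below is invented for illustration only.

def run(program, registers=8):
    regs = [0] * registers
    pc = 0
    while pc < len(program):
        op, *args = program[pc]          # fetch and decode
        pc += 1
        if op == "LOADI":                # LOADI rd, imm  -> rd = imm
            regs[args[0]] = args[1]
        elif op == "ADD":                # ADD rd, ra, rb -> rd = ra + rb
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "SUB":                # SUB rd, ra, rb -> rd = ra - rb
            regs[args[0]] = regs[args[1]] - regs[args[2]]
        elif op == "BEQZ":               # BEQZ ra, target -> branch if ra == 0
            if regs[args[0]] == 0:
                pc = args[1]
        elif op == "JUMP":               # JUMP target
            pc = args[0]
        elif op == "HALT":
            break
        else:
            raise ValueError(f"unknown opcode {op}")
    return regs

# Sum the integers 1..5 into r2, counting down in r1.
program = [
    ("LOADI", 1, 5),        # r1 = 5
    ("LOADI", 2, 0),        # r2 = 0
    ("LOADI", 3, 1),        # r3 = 1
    ("BEQZ", 1, 7),         # while r1 != 0:
    ("ADD", 2, 2, 1),       #     r2 += r1
    ("SUB", 1, 1, 3),       #     r1 -= 1
    ("JUMP", 3),
    ("HALT",),
]
print(run(program)[2])      # prints 15
```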