991 results for Segmentation methods


Relevance:

20.00%

Publisher:

Abstract:

Land use/cover classification is one of the most important applications in remote sensing. However, mapping land use/cover spatial distribution accurately is a challenge, particularly in moist tropical regions, owing to the complex biophysical environment and the limitations of remote sensing data themselves. This paper reviews a decade of experiments on land use/cover classification in the Brazilian Amazon. Comprehensive analysis of the classification results leads to the conclusion that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporating suitable textural images alongside multispectral bands and using segmentation-based methods are valuable ways to improve land use/cover classification, especially for high-spatial-resolution images. Fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integrating optical and radar data did improve classification performance when a proper data fusion method was used. Among the available classification algorithms, the maximum likelihood classifier remains an important method that provides reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results, although they often require more time for parameter optimization. Proper use of hierarchical methods is fundamental for developing accurate land use/cover classification, particularly from historical remotely sensed data.
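The maximum likelihood classifier discussed above models each land-cover class as a multivariate Gaussian over the spectral bands and assigns each pixel to the class with the highest log-likelihood. A minimal numpy sketch, with synthetic two-band data standing in for real multispectral pixels (the class names and values below are purely illustrative):

```python
import numpy as np

def fit_ml_classifier(X, y):
    """Estimate per-class mean and covariance from training pixels."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False))
    return params

def classify(X, params):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, cov = params[c]
        inv = np.linalg.inv(cov)
        d = X - mu
        # log N(x; mu, cov), dropping the constant shared by all classes
        ll = -0.5 * np.einsum('ij,jk,ik->i', d, inv, d) \
             - 0.5 * np.log(np.linalg.det(cov))
        scores.append(ll)
    return np.array(classes)[np.argmax(scores, axis=0)]

# Hypothetical "forest" vs "pasture" training pixels in two spectral bands
rng = np.random.default_rng(0)
forest = rng.normal([0.2, 0.6], 0.05, size=(200, 2))
pasture = rng.normal([0.5, 0.3], 0.05, size=(200, 2))
X = np.vstack([forest, pasture])
y = np.array([0] * 200 + [1] * 200)
pred = classify(X, fit_ml_classifier(X, y))
accuracy = (pred == y).mean()
```

With well-separated class signatures, as here, the classifier recovers the training labels almost perfectly; in practice accuracy hinges on how Gaussian the class distributions really are.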

Relevance:

20.00%

Publisher:

Abstract:

Advancements in high-throughput technologies to measure increasingly complex biological phenomena at the genomic level are rapidly changing the face of biological research from the single-gene single-protein experimental approach to studying the behavior of a gene in the context of the entire genome (and proteome). This shift in research methodologies has resulted in a new field of network biology that deals with modeling cellular behavior in terms of network structures such as signaling pathways and gene regulatory networks. In these networks, different biological entities such as genes, proteins, and metabolites interact with each other, giving rise to a dynamical system. Even though there exists a mature field of dynamical systems theory to model such network structures, some technical challenges are unique to biology such as the inability to measure precise kinetic information on gene-gene or gene-protein interactions and the need to model increasingly large networks comprising thousands of nodes. These challenges have renewed interest in developing new computational techniques for modeling complex biological systems. This chapter presents a modeling framework based on Boolean algebra and finite-state machines that are reminiscent of the approach used for digital circuit synthesis and simulation in the field of very-large-scale integration (VLSI). The proposed formalism enables a common mathematical framework to develop computational techniques for modeling different aspects of the regulatory networks such as steady-state behavior, stochasticity, and gene perturbation experiments.
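The Boolean/finite-state-machine formalism described here can be illustrated with a toy gene regulatory network: each gene is ON or OFF, the next state is a Boolean function of the current state, and exhaustively applying the synchronous update reveals the steady states (fixed-point attractors). A minimal sketch with a hypothetical three-gene loop (the regulatory rules are invented for illustration, not taken from the chapter):

```python
from itertools import product

def step(state):
    """Synchronous update of a toy 3-gene Boolean network:
    A is activated by C, B by A, C by B (a positive feedback loop)."""
    a, b, c = state
    return (c, a, b)   # A' = C, B' = A, C' = B

def steady_states():
    """A state is a steady state (fixed point) if the update maps it to itself."""
    return [s for s in product([False, True], repeat=3) if step(s) == s]

def trajectory(state, n):
    """Iterate the update n times from a given start state."""
    states = [state]
    for _ in range(n):
        states.append(step(states[-1]))
    return states

fixed = steady_states()
cycle = trajectory((True, False, False), 3)
```

For this loop, the all-OFF and all-ON states are the only fixed points, and any mixed state falls into a period-3 cycle, mirroring how steady-state behavior and oscillations are read off a finite-state machine.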

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Health professionals and policymakers aspire to base healthcare decisions on the entire body of relevant research evidence. This can rarely be achieved, however, because a considerable proportion of research findings are never published, especially in the case of 'negative' results - a phenomenon widely recognized as publication bias. Different methods of detecting, quantifying and adjusting for publication bias in meta-analyses have been described in the literature, including graphical approaches and formal statistical tests to detect publication bias, and statistical approaches that modify effect sizes to adjust a pooled estimate when publication bias is suspected. An up-to-date systematic review of these methods is lacking. METHODS/DESIGN: The objectives of this systematic review are as follows:
• To systematically review methodological articles that focus on non-publication of studies, and to describe methods of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses.
• To appraise the strengths and weaknesses of the methods, the resources they require, and the conditions under which they can be used, based on the findings of the included studies.
We will systematically search Web of Science, Medline, and the Cochrane Library for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses. A dedicated data extraction form has been developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. As this will be a qualitative systematic review, data reporting will involve a descriptive summary. DISCUSSION: Results are expected to be publicly available in mid-2013.
This systematic review together with the results of other systematic reviews of the OPEN project (To Overcome Failure to Publish Negative Findings) will serve as a basis for the development of future policies and guidelines regarding the assessment and handling of publication bias in meta-analyses.
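One widely used formal test in the family this protocol surveys is Egger's regression: regress each study's standardized effect (effect divided by its standard error) on its precision (the reciprocal of the standard error); an intercept far from zero suggests funnel-plot asymmetry. A numpy sketch on synthetic data (an illustration of the general idea, not an implementation from the review itself):

```python
import numpy as np

def egger_intercept(effects, ses):
    """Egger's regression: z_i = b0 + b1 * precision_i.
    Returns the intercept b0; values far from 0 indicate asymmetry."""
    z = effects / ses              # standardized effects
    prec = 1.0 / ses               # precisions
    X = np.column_stack([np.ones_like(prec), prec])
    b, *_ = np.linalg.lstsq(X, z, rcond=None)
    return b[0]

# Synthetic meta-analysis: 60 studies, true effect 0.3
rng = np.random.default_rng(1)
ses = rng.uniform(0.05, 0.5, size=60)
unbiased = rng.normal(0.3, ses)     # symmetric funnel
biased = unbiased + 1.0 * ses       # small-study effects inflate the estimate
b0_sym = egger_intercept(unbiased, ses)
b0_asym = egger_intercept(biased, ses)
```

Because the induced bias here is exactly proportional to each study's standard error, it shows up entirely in the intercept, which is the quantity the test inspects.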

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present the results of applying several visual analysis methods to a group of sites, dated between the 6th and 1st centuries BC, in the ager Tarraconensis (Tarragona, Spain), the hinterland of the Roman colony of Tarraco. The difficulty of interpreting the diverse results in a combined way was resolved by means of statistical methods, namely Principal Components Analysis (PCA) and K-means clustering. These methods allowed us to classify the sites according to the visual structure of the landscape that contains them and the visual relationships that could exist among them.
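The two statistical steps used here can be sketched in a few lines of numpy: project each site's visibility variables onto principal components, then cluster the sites in PC space. Synthetic "visibility metric" rows stand in for the real viewshed measurements (the group structure below is invented for illustration):

```python
import numpy as np

def pca(X, k):
    """Project rows of X onto the top-k principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign to nearest centroid, recompute means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # keep the old centroid if a cluster ever empties out
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

# Two hypothetical groups of sites with distinct visibility profiles
rng = np.random.default_rng(2)
hill = rng.normal([5.0, 1.0, 4.0], 0.3, size=(20, 3))
plain = rng.normal([1.0, 4.0, 0.5], 0.3, size=(20, 3))
scores = pca(np.vstack([hill, plain]), 2)
labels = kmeans(scores, 2)
```

When the visibility profiles separate cleanly, the clustering recovers the two groups; with real sites the cluster count and separation are of course an empirical question.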

Relevance:

20.00%

Publisher:

Abstract:

New methods and devices for pursuing performance enhancement through altitude training were developed in Scandinavia and the USA in the early 1990s. At present, several forms of hypoxic training and/or altitude exposure exist: traditional 'live high-train high' (LHTH), contemporary 'live high-train low' (LHTL), intermittent hypoxic exposure during rest (IHE) and intermittent hypoxic training (IHT). Although substantial differences exist between these methods of hypoxic training and/or exposure, all have the same goal: to induce an improvement in athletic performance at sea level. They are also used for preparation for competition at altitude and/or for the acclimatization of mountaineers. The underlying mechanisms behind the effects of hypoxic training are widely debated. Although the popular view is that altitude training may lead to an increase in haematological capacity, this may not be the main, or the only, factor involved in the improvement of performance. Other central (such as ventilatory, haemodynamic or neural adaptation) or peripheral (such as muscle buffering capacity or economy) factors play an important role. LHTL was shown to be an efficient method. The optimal altitude for living high has been defined as being 2200-2500 m to provide an optimal erythropoietic effect and up to 3100 m for non-haematological parameters. The optimal duration at altitude appears to be 4 weeks for inducing accelerated erythropoiesis, whereas <3 weeks (i.e. 18 days) is long enough for beneficial changes in economy, muscle buffering capacity, the hypoxic ventilatory response or Na(+)/K(+)-ATPase activity. One critical point is the daily dose of altitude. A natural altitude of 2500 m for 20-22 h/day (in fact, travelling down to the valley only for training) appears sufficient to increase erythropoiesis and improve sea-level performance.
'Longer is better' as regards haematological changes, since additional benefits have been shown as hypoxic exposure increases beyond 16 h/day. The minimum daily dose for stimulating erythropoiesis seems to be 12 h/day. For non-haematological changes, a much shorter duration of exposure seems possible. Athletes could take advantage of IHT, which seems more beneficial than IHE for performance enhancement. The intensity of hypoxic exercise might play a role in adaptations at the molecular level in skeletal muscle tissue. There is clear evidence that intense exercise at high altitude stimulates muscle adaptations to a greater extent for both aerobic and anaerobic exercise and limits the decrease in power. So although IHT induces no increase in VO(2max) due to the low 'altitude dose', improvement in athletic performance is likely to occur with high-intensity exercise (i.e. above the ventilatory threshold) due to an increase in mitochondrial efficiency and pH/lactate regulation. We propose a new combination of hypoxic methods (which we suggest naming Living High-Training Low and High, interspersed; LHTLHi) combining LHTL (five nights at 3000 m and two nights at sea level) with training at sea level, except for a few (2-3 per week) IHT sessions of supra-threshold training. This review also provides a rationale for how to combine the different hypoxic methods and suggests advances in both their implementation and their periodization during the yearly training programme of athletes competing in endurance, glycolytic or intermittent sports.

Relevance:

20.00%

Publisher:

Abstract:

We present a method to automatically segment red blood cells (RBCs) visualized by digital holographic microscopy (DHM), based on the marker-controlled watershed algorithm. Quantitative phase images of RBCs obtained with off-axis DHM provide important information about each RBC, including size, shape, volume and hemoglobin content. The most important step in marker-controlled watershed segmentation is accurate localization of the internal and external markers. Here, we first obtain a binary image via the Otsu algorithm. Then we apply morphological operations to the binary image to obtain the internal markers. Next, we apply the distance transform algorithm combined with the watershed algorithm to generate external markers from the internal ones. Finally, combining the internal and external markers, we modify the original gradient image and apply the watershed algorithm. By identifying the internal and external markers appropriately, the problems of oversegmentation and undersegmentation are avoided. Moreover, the same marker-controlled procedure can also segment the internal and external parts of each RBC phase image. Our experimental results show that the proposed method performs well in segmenting RBCs and could thus be helpful when combined with automated classification of RBCs.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this bachelor's thesis was to survey scientific research articles in order to present the factors contributing to medication errors made by nurses in a hospital setting, and to introduce methods for preventing medication errors. Additionally, international and Finnish research was combined, and the findings were reflected on in relation to the Finnish health care system. A literature review of 23 scientific articles was conducted. Data were searched systematically from the CINAHL, MEDIC and MEDLINE databases, and also manually. The literature was analysed and the findings combined using inductive content analysis. The findings revealed that both organisational and individual factors contributed to medication errors. High workload, communication breakdowns, an unsuitable working environment, distractions and interruptions, and similar medication products were identified as organisational factors. Individual factors included nurses' failure to follow protocol, inadequate knowledge of medications, and personal qualities of the nurse. Developing and improving the physical environment, error reporting, and medication management protocols were emphasised as methods to prevent medication errors. Investing in staff competence and well-being was also identified as a prevention method. The number of Finnish articles was small, so the applicability of the findings to Finland is difficult to assess. However, the findings seem to fit the Finnish health care system relatively well. Further research is needed to identify the factors that contribute to medication errors in Finland; this is a prerequisite for developing prevention methods that fit the Finnish health care system.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this work was to evaluate the protective effect of different forms of insecticide application on the transmission of yellow dwarf disease in barley cultivars, as well as to determine the production costs and net profit of these managements. The experiments were carried out during the 2011 and 2012 growing seasons, using the following managements in the main plots: T1, seed treatment with insecticide (ST) + insecticide on shoots at 15-day intervals; T2, ST only; T3, insecticide applied on shoots when the aphid control level (CL) was reached; T4, no insecticide; and T5, ST + insecticide on shoots when CL was reached. Different barley cultivars - BRS Cauê, BRS Brau and MN 6021 - were arranged in the subplots. The insecticides lambda-cyhalothrin (a pyrethroid) and thiamethoxam (a neonicotinoid) were used. There were differences in the yellow dwarf disease index between treatments in both seasons, while damage to grain yield was influenced by year and aphid population. Production costs and net profit differed among treatments. Seed treatment with insecticide is sufficient to reduce the transmission of yellow dwarf disease in years with low aphid population pressure, while in years with larger populations the application of insecticide on shoots is also required.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this master's thesis was to examine flow-related phenomena and gas dispersion by means of computational fluid dynamics. The thesis is divided into five parts: introduction, theory, a review of studies on flow modelling in porous media, numerical modelling, and the presentation of results and conclusions. The first part of the thesis considers the various experimental, numerical and theoretical methods available for modelling flow in porous media. The literature part reviews previously published semi-empirical and empirical studies of the pressure loss caused by porous media. In the computational fluid dynamics part, numerical models describing the porous material were built and presented using the commercial FLUENT software. At the end of the work, the results from theory, numerical flow computation and experimental studies were assessed. The results obtained from the three-dimensional numerical modelling of the porous material appeared promising, and on their basis recommendations were made for future modelling of flow in porous media. Some of the results presented in this thesis will be presented at the 55th Canadian Chemical Engineering Conference in Toronto, 16-19 October 2005, and in an international ASME engineering publication; the work has been accepted for presentation in the computational fluid dynamics (CFD) topic area 'Fundamentals'. In addition, the detailed results of the work will also be submitted to the CES journal.
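The semi-empirical pressure-loss correlations reviewed in the literature part are typified by the Ergun equation, which combines a viscous (Darcy) term and an inertial (Forchheimer) term; these are also the two contributions a CFD porous-zone model asks for as coefficients. A sketch of the standard Ergun form (whether this particular correlation is among those reviewed is an assumption, and the numerical values below are illustrative, not from the thesis):

```python
def ergun_pressure_gradient(u, eps, d_p, mu, rho):
    """Pressure loss per unit length (Pa/m) of a packed bed, Ergun (1952):
    dP/L = 150*mu*(1-eps)^2/(eps^3*d_p^2)*u + 1.75*rho*(1-eps)/(eps^3*d_p)*u^2
    u: superficial velocity (m/s), eps: porosity, d_p: particle diameter (m),
    mu: dynamic viscosity (Pa s), rho: fluid density (kg/m^3)."""
    viscous = 150.0 * mu * (1 - eps) ** 2 / (eps ** 3 * d_p ** 2) * u
    inertial = 1.75 * rho * (1 - eps) / (eps ** 3 * d_p) * u ** 2
    return viscous + inertial

# Illustrative case: air through a bed of 5 mm particles, porosity 0.4
dp_per_m = ergun_pressure_gradient(u=0.1, eps=0.4, d_p=0.005,
                                   mu=1.8e-5, rho=1.2)
```

The quadratic velocity term is what makes the pressure gradient grow faster than linearly with flow rate, the behaviour that purely Darcian models miss at higher Reynolds numbers.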

Relevance:

20.00%

Publisher:

Abstract:

This work is devoted to the problem of reconstructing the basis weight structure of a paper web with black-box techniques. The data analysed come from a real paper machine and were collected by an off-line scanner. The principal mathematical tool used in this work is Autoregressive Moving Average (ARMA) modelling. When coupled with the Discrete Fourier Transform (DFT), it gives a very flexible and interesting tool for analysing properties of the paper web. In a simplified version of our algorithm, ARMA and the DFT are used independently to represent the given signal, but the final goal is to combine the two. The Ljung-Box Q-statistic lack-of-fit test, combined with the root mean squared error, provides a tool for separating significant signals from noise.
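The Ljung-Box statistic used here as the lack-of-fit test is straightforward to compute directly: Q = n(n+2) Σ_{k=1..h} ρ̂_k²/(n−k), where ρ̂_k is the lag-k sample autocorrelation; a large Q (against a χ² reference) means the series still carries autocorrelation, i.e. structure rather than noise. A numpy sketch on synthetic data (not the thesis's own implementation):

```python
import numpy as np

def ljung_box_q(x, h):
    """Ljung-Box Q statistic over the first h sample autocorrelations."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    denom = np.dot(xc, xc)
    q = 0.0
    for k in range(1, h + 1):
        rho_k = np.dot(xc[:-k], xc[k:]) / denom   # lag-k autocorrelation
        q += rho_k ** 2 / (n - k)
    return n * (n + 2) * q

rng = np.random.default_rng(3)
noise = rng.normal(size=200)                       # should look uncorrelated
signal = np.sin(2 * np.pi * np.arange(200) / 25)   # strongly autocorrelated
q_noise = ljung_box_q(noise, 10)
q_signal = ljung_box_q(signal, 10)
```

For white noise Q stays near the number of lags tested, while a periodic basis-weight component drives it orders of magnitude higher, which is exactly how the test separates signal from noise.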

Relevance:

20.00%

Publisher:

Abstract:

This master's thesis covers the concepts of knowledge discovery, data mining and technology forecasting methods in telecommunications. It covers the various aspects of knowledge discovery in databases and discusses in detail the data mining and technology forecasting methods used in telecommunications. The main concern throughout the thesis is to emphasise the methods used in technology forecasting for telecommunications and in data mining. It attempts to answer, to some extent, the question: do forecasts create the future? It also describes a few difficulties that arise in technology forecasting. This thesis was done as part of my master's studies at Lappeenranta University of Technology.

Relevance:

20.00%

Publisher:

Abstract:

In vivo fetal magnetic resonance imaging provides a unique approach for the study of early human brain development [1]. In utero cerebral morphometry could potentially be used as a marker of cerebral maturation and help to distinguish between normal and abnormal development in ambiguous situations. However, this quantitative approach is a major challenge because of the movement of the fetus inside the amniotic cavity, the poor spatial resolution provided by very fast MRI sequences, and the partial volume effect. Extensive efforts have been made to reconstruct high-resolution 3D fetal volumes from several lower-resolution acquisitions [2,3,4]. Frameworks have been developed for the segmentation of specific regions of the fetal brain such as the posterior fossa, brainstem or germinal matrix [5,6], or for the entire brain tissue [7,8], applying the Expectation-Maximization Markov Random Field (EM-MRF) framework. However, many of these previous works focused on the young fetus (i.e. before 24 weeks) and use anatomical atlas priors to segment the different tissues or regions. As most gyral development takes place after the 24th week, a comprehensive and clinically meaningful study of the fetal brain should not dismiss the third trimester of gestation. To cope with the rapidly changing appearance of the developing brain, some authors have proposed a dynamic atlas [8]. In our opinion, however, this approach faces a risk of circularity: each brain is analyzed/deformed using the template of its biological age, potentially biasing the effective developmental delay. Here, we expand our previous work [9] to propose a prior-free post-processing pipeline that allows a comprehensive set of morphometric measurements devoted to clinical application.
Data set & Methods: Prenatal MR imaging was performed with a 1-T system (GE Medical Systems, Milwaukee) using single-shot fast spin echo (ssFSE) sequences (TR 7000 ms, TE 180 ms, FOV 40 x 40 cm, slice thickness 5.4 mm, in-plane spatial resolution 1.09 mm). For each fetus, 6 axial volumes shifted by 1 mm were acquired under the mother's sedation (about 1 min per volume). First, each volume is segmented semi-automatically using region-growing algorithms to extract the fetal brain from the surrounding maternal tissues. Inhomogeneity intensity correction [10] and linear intensity normalization are then performed. Brain tissues (CSF, GM and WM) are then segmented based on the low-resolution volumes as presented in [9]. A high-resolution image with isotropic voxel size of 1.09 mm is created as proposed in [2], using B-splines for the scattered data interpolation [11]. Basal ganglia segmentation is performed using a level set implementation on the high-resolution volume [12]. The resulting white matter image is then binarized and given as input to the FreeSurfer software (http://surfer.nmr.mgh.harvard.edu) to provide topologically accurate three-dimensional reconstructions of the fetal brain according to the local intensity gradient. References: [1] Guibaud, Prenatal Diagnosis 29(4) (2009). [2] Rousseau, Acad. Rad. 13(9), 2006. [3] Jiang, IEEE TMI 2007. [4] Warfield, IADB, MICCAI 2009. [5] Claude, IEEE Trans. Bio. Eng. 51(4), 2004. [6] Habas, MICCAI 2008. [7] Bertelsen, ISMRM 2009. [8] Habas, Neuroimage 53(2), 2010. [9] Bach Cuadra, IADB, MICCAI 2009. [10] Styner, IEEE TMI 19(3), 2000. [11] Lee, IEEE Trans. Visual. and Comp. Graph. 3(3), 1997. [12] Bach Cuadra, ISMRM 2010.
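The first step of the pipeline, semi-automatic region growing to separate fetal brain from maternal tissue, amounts to a breadth-first flood from a seed that accepts neighbouring voxels whose intensity stays within a tolerance of the seed's. A 2-D toy sketch of that idea (the real pipeline is 3-D and interactive; image, seed and tolerance below are invented):

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol):
    """Grow a region from `seed`, accepting 4-connected pixels whose
    intensity is within `tol` of the seed intensity."""
    mask = np.zeros(img.shape, dtype=bool)
    ref = img[seed]
    queue = deque([seed])
    mask[seed] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                    and not mask[ny, nx] and abs(img[ny, nx] - ref) <= tol):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

# Toy "slice": a bright brain-like square inside darker surroundings
img = np.full((40, 40), 20.0)
img[10:30, 10:30] = 100.0
mask = region_grow(img, seed=(20, 20), tol=30.0)
```

The semi-automatic part in practice is the operator's choice of seed and tolerance per volume; everything else is this flood.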

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To prospectively evaluate the accuracy and reliability of "freehand" posttraumatic orbital wall reconstruction with AO (Arbeitsgemeinschaft Osteosynthese) titanium mesh plates by using computer-aided volumetric measurement of the bony orbits. METHODS: Bony orbital volume was measured in 12 patients from coronal CT scan slices using OsiriX Medical Image software. After defining the volumetric limits of the orbit, the segmentation of the bony orbital region of interest of each single slice was performed. At the end of the segmentation process, all regions of interest were grouped and the volume was computed. The same procedure was performed on both orbits, and thereafter the volume of the contralateral uninjured orbit was used as a control for comparison. RESULTS: In all patients, the volume data of the reconstructed orbit fitted that of the contralateral uninjured orbit with accuracy to within 1.85 cm3 (7%). CONCLUSIONS: This preliminary study has demonstrated that posttraumatic orbital wall reconstruction using "freehand" bending and placement of AO titanium mesh plates results in a high success rate in re-establishing preoperative bony volume, which closely approximates that of the contralateral uninjured orbit.
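The volume computation described (segment a region of interest on each CT slice, group the ROIs, compute the volume) reduces to summing the segmented area per slice and multiplying by the slice spacing. A sketch on a synthetic stack where the true volume is known, a cylinder, with hypothetical voxel dimensions (the study's actual pixel size is not stated here):

```python
import numpy as np

def volume_from_slices(masks, pixel_area_mm2, slice_thickness_mm):
    """Sum the segmented area of every slice, scale to cm^3."""
    voxel_mm3 = pixel_area_mm2 * slice_thickness_mm
    return sum(m.sum() for m in masks) * voxel_mm3 / 1000.0

# Synthetic stack: a circular ROI of radius 10 mm on 20 slices, 1 mm apart
yy, xx = np.mgrid[0:64, 0:64]
circle = (yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2
masks = [circle] * 20
vol = volume_from_slices(masks, pixel_area_mm2=1.0, slice_thickness_mm=1.0)
expected = np.pi * 10 ** 2 * 20 / 1000.0   # ideal cylinder volume, cm^3
```

The small gap between the voxel count and the analytic volume illustrates the discretization error inherent in slice-based volumetry, which is one reason paired orbits are compared rather than absolute values trusted in isolation.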