927 results for Mining extraction model


Relevance: 30.00%

Abstract:

Introduction: We conducted the present study to investigate whether early large-volume crystalloid infusion can restore gut mucosal blood flow and mesenteric oxygen metabolism in severe sepsis.

Methods: Anesthetized and mechanically ventilated male mongrel dogs were challenged with an intravenous injection of live Escherichia coli (6 × 10^9 colony-forming units/ml per kg over 15 min). After 90 min they were randomly assigned to one of two groups – control (no fluids; n = 13) or lactated Ringer's solution (32 ml/kg per hour; n = 14) – and followed for 60 min. Cardiac index, mesenteric blood flow, mean arterial pressure, systemic and mesenteric oxygen-derived variables, blood lactate and gastric carbon dioxide tension (PCO2; by gas tonometry) were assessed throughout the study.

Results: E. coli infusion significantly decreased arterial pressure, cardiac index, mesenteric blood flow, and systemic and mesenteric oxygen delivery, and increased arterial and portal lactate, intramucosal PCO2, the PCO2 gap (the difference between gastric mucosal and arterial PCO2), and the systemic and mesenteric oxygen extraction ratios in both groups. The Ringer's solution group had a significantly higher cardiac index and systemic oxygen delivery, and a lower oxygen extraction ratio and PCO2 gap, at 165 min as compared with control animals. However, infusion of lactated Ringer's solution was unable to restore the PCO2 gap. There were no significant differences between groups in mesenteric oxygen delivery, oxygen extraction ratio, or portal lactate at the end of the study.

Conclusion: Significant disturbances occur in the systemic and mesenteric beds during bacteremic severe sepsis. Although large-volume infusion of lactated Ringer's solution restored systemic hemodynamic parameters, it was unable to correct the gut mucosal PCO2 gap.
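The oxygen-derived variables and the PCO2 gap named above are simple derived quantities. As a minimal sketch, the snippet below computes them from standard formulas (blood O2 content, O2 delivery, extraction ratio); all numeric values are hypothetical and not taken from the study.

```python
# Derived hemodynamic variables as named in the abstract; standard formulas,
# example values are hypothetical and not taken from the study.

def o2_content(hb_g_dl: float, sat: float, po2_mmhg: float) -> float:
    """Blood O2 content (ml O2/dl): hemoglobin-bound plus dissolved."""
    return 1.34 * hb_g_dl * sat + 0.0031 * po2_mmhg

def oxygen_delivery(cardiac_index: float, cao2: float) -> float:
    """Systemic O2 delivery index (ml/min/m^2).
    cardiac_index in l/min/m^2, cao2 in ml/dl; x10 converts dl to l."""
    return cardiac_index * cao2 * 10

cao2 = o2_content(hb_g_dl=12.0, sat=0.97, po2_mmhg=95.0)   # arterial
cvo2 = o2_content(hb_g_dl=12.0, sat=0.70, po2_mmhg=40.0)   # mixed venous

do2 = oxygen_delivery(cardiac_index=3.5, cao2=cao2)
o2_extraction_ratio = (cao2 - cvo2) / cao2                  # dimensionless

# PCO2 gap as defined in the abstract: gastric mucosal minus arterial PCO2.
pco2_gap = 58.0 - 38.0  # mmHg, hypothetical tonometry reading

print(f"DO2={do2:.0f} ml/min/m2, O2ER={o2_extraction_ratio:.2f}, gap={pco2_gap} mmHg")
```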

Relevance: 30.00%

Abstract:

The Passifloraceae family is extensively used in native Brazilian folk medicine to treat a wide variety of diseases. The problem of flavonoid extraction from Passiflora was addressed by design of experiments (DOE), as a mixture experiment including one categorical process variable. The components of the binary mixture were ethanol (component A) and water (component B); the categorical process variable, extraction method (factor C), was varied at two levels: (+1) maceration and (-1) percolation. ANOVA suggested a cubic model for P. edulis extraction and a quadratic model for P. alata. These results indicate that the proportion of components A and B in the mixture is the main factor involved in significantly increasing flavonoid extraction. With regard to the extraction methods, no important differences were observed, which indicates that these two traditional extraction methods can be used effectively to extract flavonoids from both medicinal plants. Evaluation of the antioxidant activity of the extracts by the ORAC method showed that P. edulis displays twice as much antioxidant activity as P. alata. Considering that maceration is a simple, rapid and environmentally friendly extraction method, the optimized conditions for flavonoid extraction from these Passiflora species in this study are maceration with 75% ethanol for P. edulis and 50% ethanol for P. alata.
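As a rough illustration of the mixture-design modelling described above, the sketch below fits Scheffé quadratic and special-cubic polynomials to a binary ethanol/water mixture with statsmodels; the design points and responses are invented, and the thesis' actual software and data are not given in the abstract.

```python
# Illustrative Scheffé fits for a binary mixture (xA + xB = 1).
# Data are invented for demonstration; the study's design points differ.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "xA": [1.00, 0.75, 0.50, 0.25, 0.00, 0.75, 0.50],   # ethanol fraction
    "y":  [10.2, 14.8, 13.1,  9.5,  4.0, 15.1, 12.7],   # flavonoid yield
})
df["xB"] = 1 - df["xA"]                                   # water fraction

# Scheffé models have no intercept; the special-cubic term xA*xB*(xA - xB)
# distinguishes the cubic model (P. edulis) from the quadratic (P. alata).
df["cubic"] = df.xA * df.xB * (df.xA - df.xB)
quadratic = smf.ols("y ~ xA + xB + xA:xB - 1", df).fit()
cubic = smf.ols("y ~ xA + xB + xA:xB + cubic - 1", df).fit()

print(quadratic.params, cubic.params, sep="\n")
```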

Relevance: 30.00%

Abstract:

The first part of the research project of this co-advised Ph.D. thesis was aimed at selecting the Bifidobacterium longum strains best suited to form the basis of our study. We were looking for strains able to colonize the intestinal mucosa and with good adhesion capacities, so that we could test their ability to induce apoptosis in "damaged" intestinal cells. Adhesion and apoptosis are the two processes we wanted to study in order to better understand the role of an adhesion protein that we had previously identified, which shows high-scoring homology with the serpin-encoding gene recently identified in B. longum by Nestlé researchers. Bifidobacterium longum is a probiotic known for its beneficial effects on the human gut and also for its immunomodulatory and antitumor activities. Recently, many studies have emphasized the intimate relationship between probiotic bacteria and the GIT mucosa and their influence on human cellular homeostasis. We focused on the apoptotic deletion of cancer cells induced by B. longum. This was evaluated in vitro by incubating three B. longum strains with enterocyte-like Caco-2 cells to detect DNA fragmentation, a hallmark of apoptosis. The three strains tested were characterized for their adhesion properties using adhesion and autoaggregation assays, features considered necessary for selecting a probiotic strain. The three strains, named B12, B18 and B2990, were classified as "strongly adherent", "adherent" and "non-adherent", respectively. The bacteria were then incubated with Caco-2 cells to investigate apoptotic deletion. Co-cultures of Caco-2 cells with B. longum were positive in the DNA fragmentation test only when adherent strains (B12 and B18) were used. These results indicate that interaction with adherent B. longum can induce apoptotic deletion of Caco-2 cells, suggesting a role in the cellular homeostasis of the gastrointestinal tract and in restoring the ecology of damaged colon tissues. These results informed the next stage of the research, in which the tested strains were used as recipients in recombinant techniques aimed at generating new B. longum strains with an enhanced capacity to induce apoptosis in "damaged" intestinal cells. To achieve this goal we decided to clone the serpin-encoding gene of B. longum, so as to understand its role in adhesion and apoptosis induction. Bifidobacterium longum has an immunostimulant activity that in vitro can lead to an apoptotic response in the Caco-2 cell line. It secretes a hypothetical eukaryotic-type serpin protein, which could be involved in this kind of deletion of damaged cells. We had previously characterized a protein with homology to the hypothetical serpin of B. longum (DD087853). In order to create Bifidobacterium serpin transformants, a B. longum cosmid library was screened with a PCR protocol using primers specific for the serpin gene. After fragment extraction, the insert named S1 was sub-cloned into pRM2, an Escherichia coli - Bifidobacterium shuttle vector, to construct pRM3. Several protocols for B. longum transformation were tried, and the best efficiency was obtained using MRS medium and raffinose. Finally, bacterial cell supernatants were tested in a dot-blot assay to detect antigens reacting with an anti-antitrypsin polyclonal antibody. The best signal was produced by one strain, which was renamed B. longum BLKS 7.
Our research was thus aimed at generating transformants able to overexpress the serpin-encoding gene, providing the tools for a further study on bacterial apoptotic induction in the Caco-2 cell line. Having obtained the new transformants, the next step was to test their abilities when exposed to an intestinal cell model. This part of the project was carried out in the Department of Biochemistry of the Medical Faculty of the University of Maribor, hosted by the foreign supervisor of the co-advised doctoral thesis, Prof. Avrelija Cencic. In this study we examined the probiotic ability of several bacterial strains using intestinal cells from a 6-year-old pig. The use of intestinal mammalian cells is essential to study this symbiosis, and a functional cell model mimics a polarized epithelium in which enterocytes are separated by tight junctions. Among these strains we included the Bifidobacterium longum BKS7 transformant strain that we had previously generated, in order to compare its abilities. Wild-type B. longum B12, the B. longum BKS7 transformant and eight Lactobacillus strains of different origins were co-cultured with porcine small intestine epithelial cells (PSI C1) and porcine blood monocytes (PoM2) in Transwell filter inserts. The strains, including Lb. gasseri, Lb. fermentum, Lb. reuteri, Lb. plantarum and unidentified Lactobacillus isolates from Kenyan Maasai milk and Tanzanian coffee, were assayed for activation of the cell lines by measuring nitric oxide (Griess reaction), H2O2 (tetramethylbenzidine reaction) and O2- (cytochrome C reduction). Cytotoxicity (crystal violet staining) and induction of metabolic activity (MTT cell proliferation assay) were also tested. The transepithelial electrical resistance (TER) of polarized PSI C1 was measured during 48 hours of co-culture. TER, used to assess epithelial permeability, decreases during pathogenesis as the tissue becomes permeable to passive ion flow, lowering the epithelial barrier function; probiotics can prevent or reverse increased permeability. Lastly, a dot-blot assay for interleukin-6 was performed on the supernatants of treated cells. The metabolic activity of PoM2 and PSI C1 increased slightly after co-culture, without affecting mitochondrial functions. No strain was cytotoxic to PSI C1 or PoM2, and no cell activation was observed, as measured by the release of NO2, H2O2 and O2- by PoM2 and PSI C1. During co-culture the TER of polarized PSI C1 was two-fold higher than the constant TER (~3000) of untreated cells. The TER increase induced by the bacteria maintains a low permeability of the epithelium. During treatment, interleukin-6 was detected in cell supernatants at several time points, confirming immunostimulant activity. All results were obtained using Lactobacillus paracasei Shirota and Carnobacterium divergens as controls. In conclusion, we can state that both the panel of putative probiotic bacteria and our new transformant strain of B. longum are not harmful when exposed to intestinal cells and could be selected as probiotics, because they can strengthen the epithelial barrier function and stimulate the nonspecific immunity of intestinal cells in a pig cell model. Indeed, we found that none of the tested strains with good adhesion abilities is cytotoxic to intestinal cells, and that none of them induces the cell lines to produce high levels of ROS or NO2. Moreover, we also assayed the capacity to produce certain cytokines correlated with the immune response.
Interleukin-6 was detected in all our samples, including the B. longum transformant BKS 7 strain; this result indicates that these bacteria can induce a nonspecific immune response in intestinal cells. In fact, when we assayed the presence of interferon-gamma in cell supernatants after bacterial exposure, no positive signals were obtained, meaning that no specific immune response was activated and confirming that these bacteria are not recognized as pathogens by the intestinal cells and are certainly not harmful to them. The most important result is the measurement of transepithelial electrical resistance, which showed that the intestinal barrier function is strengthened when cells are exposed to the bacteria, owing to a reduction in epithelial permeability. We now have a new strain of B. longum that will be used for further studies on the mechanism of apoptotic induction in "damaged" cells and on the process of "restoring ecology". This strain will be the basis for generating new transformant strains for the serpin-encoding gene with better performance, which may one day be used in clinical settings, for instance in "gene therapy" for cancer treatment and prevention.

Relevance: 30.00%

Abstract:

Salt deposits characterize the subsurface of Tuzla (BiH) and have made it famous since ancient times. Archaeological discoveries demonstrate the presence of a Neolithic pile-dwelling settlement related to the existence of saltwater springs, which contributed to making most of the area a swampy ground. Since Roman times the town has been reported as "the City of Salt Deposits and Springs"; "tuz" is the Turkish word for salt, the name the Ottomans gave the settlement in the 15th century following their conquest of medieval Bosnia (Donia and Fine, 1994). Natural brine springs were located everywhere, and salt has been evaporated by means of hot charcoals since pre-Roman times. This ancient use of salt was a small exploitation compared with the massive salt production carried out during the 20th century by means of classical mining methods and, especially, wild brine pumping. In the past salt extraction was practised by tapping natural brine springs, while the modern technique consists of about 100 boreholes with pumps tapping the natural underground brine runs at an average depth of 400-500 m. The mining operations changed the hydrogeological conditions, enabling the downward flow of fresh water and causing additional salt dissolution. This process induced severe ground subsidence over the last 60 years, reaching up to 10 meters of sinking in the most affected area. Stress and strain of the overlying rocks induced the formation of numerous fractures over a considerable area (3 km2). Consequently, serious damage occurred to buildings and infrastructure such as the water supply system, sewage networks and power lines. Downtown urban life was compromised by the destruction of more than 2000 buildings that collapsed or needed to be demolished, causing the resettlement of about 15000 inhabitants (Tatić, 1979). Recently, salt extraction activities have been strongly reduced, but the underground water system is returning to its natural conditions, threatening to flood the most collapsed areas. Over the last 60 years the local government developed a monitoring system for the phenomenon, collecting data on geodetic measurements, the amount of brine pumped, piezometry, lithostratigraphy, the extension of the salt body and geotechnical parameters. A database was created within a scientific cooperation between the municipality of Tuzla and the city of Rotterdam (D.O.O. Mining Institute Tuzla, 2000). The scientific investigation presented in this dissertation was financially supported by a cooperation project between the Municipality of Tuzla, the University of Bologna (CIRSA) and the Province of Ravenna. The University of Tuzla (RGGF) gave important scientific support, in particular on the geological and hydrogeological features. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are well understood in only a few areas (Gutierrez et al., 2008). The subject of this study is the collapse phenomenon occurring in the Tuzla area, with the aim of identifying and quantifying the several factors involved in the system and their correlations. The Tuzla subsidence phenomenon can be defined as a geohazard, which represents the consequence of an adverse combination of geological processes and ground conditions precipitated by human activity with the potential to cause harm (Rosenbaum and Culshaw, 2003). Where a hazard induces a risk to a vulnerable element, a risk management process is required.
The single factors involved in the subsidence of Tuzla can be considered as hazards. The final objective of this dissertation is a preliminary risk assessment procedure and guidelines, developed in order to quantify the vulnerability of buildings in relation to the overall geohazard affecting the town. The available historical database, never fully processed before, has been analyzed by means of geographic information systems and mathematical interpolators (PART I). Modern geomatic applications have been implemented to investigate the most relevant hazards in depth (PART II). In order to monitor and quantify the actual subsidence rates, geodetic GPS technologies were implemented and four survey campaigns were carried out, one per year. The subsidence-related fracture system was identified by means of field surveys and mathematical interpretations of the sinking surface, called curvature analysis. The comparison of mapped and predicted fractures led to a better comprehension of the problem. The results confirmed the reliability of fracture identification using curvature analysis applied to sinking data instead of topographic or seismic data. The evolution of urban changes was reconstructed by analyzing topographic maps and satellite imagery, identifying the most damaged areas. This part of the investigation was very important for the quantification of building vulnerability.
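To make the curvature analysis mentioned above concrete, a minimal sketch follows: given a gridded sinking surface, finite-difference second derivatives approximate the curvature, and high-curvature cells flag likely fracture zones. The grid, spacing and threshold are assumptions for illustration, not the dissertation's implementation.

```python
# Sketch of curvature analysis on a gridded sinking surface (synthetic data).
# High curvature of the subsidence bowl marks likely fracture zones.
import numpy as np

dx = 50.0                                   # grid spacing in metres (assumed)
x, y = np.meshgrid(np.linspace(-1000, 1000, 41), np.linspace(-1000, 1000, 41))
sinking = -10.0 * np.exp(-(x**2 + y**2) / (2 * 400.0**2))  # synthetic bowl, m

# Second derivatives via finite differences.
zy, zx = np.gradient(sinking, dx)
zyy, zyx = np.gradient(zy, dx)
zxy, zxx = np.gradient(zx, dx)

mean_curvature = 0.5 * (zxx + zyy)          # Laplacian-based approximation
threshold = np.percentile(np.abs(mean_curvature), 95)
fracture_mask = np.abs(mean_curvature) > threshold

print(f"cells flagged as potential fracture zones: {fracture_mask.sum()}")
```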

Relevance: 30.00%

Abstract:

The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept which has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his or her social context; from this need the interdisciplinary sector called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (the interconnected conditions that occur in an activity) and individualization (the characteristics that distinguish an individual). This shift of focus to the individual's needs undermines the rigid linearity of the classical model, which has been superseded by the "berry picking" model; the latter explains how search terms change thanks to the informational feedback received from the search activity, introducing the concept of the evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed: it is now expressed through the generation of the query and its own context. In fact, the method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing each IR system led to developing it as a middleware of interaction between the user and the IR system. The system therefore has just two possible actions: rewriting the query and reordering the results. Actions equivalent to this approach are described in the PS literature, which generally exploits information derived from the analysis of user behavior, while the proposed approach exploits knowledge provided by the user. The thesis goes further to propose a novel assessment procedure, according to the "Cranfield paradigm", in order to evaluate this type of IR system. The results achieved are interesting considering both the effectiveness obtained and the innovative approach undertaken, together with the several applications it inspired based on a local knowledge base.
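A minimal sketch of such a middleware, limited to the two actions described (query rewriting from a local knowledge base and result reordering), could look as follows; the class, knowledge-base shape and ranking heuristic are hypothetical illustrations, not the thesis' actual design.

```python
# Hypothetical sketch of an IR middleware with exactly two actions:
# (1) rewrite the query using a local knowledge base, (2) reorder results.
from typing import Callable

KnowledgeBase = dict[str, list[str]]   # term -> context terms (assumed shape)

class PersonalizationMiddleware:
    def __init__(self, kb: KnowledgeBase, search: Callable[[str], list[str]]):
        self.kb = kb
        self.search = search           # the underlying, unmodified IR system

    def rewrite(self, query: str) -> str:
        """Expand the query with context terms from the local knowledge base."""
        extra = [t for w in query.split() for t in self.kb.get(w, [])]
        return " ".join([query, *extra])

    def reorder(self, results: list[str]) -> list[str]:
        """Rank results by overlap with the user's knowledge-base vocabulary."""
        vocab = {t for terms in self.kb.values() for t in terms}
        return sorted(results, key=lambda r: -len(vocab & set(r.split())))

    def retrieve(self, query: str) -> list[str]:
        return self.reorder(self.search(self.rewrite(query)))

# Usage with a stub search engine:
kb = {"jaguar": ["car", "speed"]}
mw = PersonalizationMiddleware(kb, lambda q: ["jaguar car review", "jaguar habitat"])
print(mw.retrieve("jaguar"))
```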

Relevance: 30.00%

Abstract:

Apart from the article forming the main content, most HTML documents on the WWW contain additional content such as navigation menus, design elements or commercial banners. In the context of several applications it is necessary to draw the distinction between main and additional content automatically. Content extraction and template detection are the two approaches to solving this task. This thesis gives an extensive overview of existing algorithms from both areas. It contributes an objective way to measure and evaluate the performance of content extraction algorithms under different aspects. These evaluation measures make it possible to draw the first objective comparison of existing extraction solutions. The newly introduced content code blurring algorithm overcomes several drawbacks of previous approaches and proves to be the best content extraction algorithm at the moment. An analysis of methods to cluster web documents according to their underlying templates is the third major contribution of this thesis. In combination with a localised crawling process, this clustering analysis can be used to automatically create sets of training documents for template detection algorithms. As the whole process can be automated, it allows template detection to be performed on a single document, thereby combining the advantages of single- and multi-document algorithms.
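To give the flavour of the content-versus-code idea behind content code blurring, here is a heavily simplified sketch: each character is marked as text or markup, the binary vector is blurred with a moving average, and high-density runs are kept. Window size and threshold are arbitrary assumptions; the actual algorithm is more elaborate.

```python
# Simplified illustration of the content-vs-code idea: 1 = text character,
# 0 = markup; a moving average ("blur") exposes regions dominated by content.
# Parameters are arbitrary assumptions, not the thesis' tuned values.
import re
import numpy as np

def main_content_regions(html: str, window: int = 40, thresh: float = 0.7):
    mask = np.ones(len(html))
    for m in re.finditer(r"<[^>]*>", html):      # crude: mark tags as code
        mask[m.start():m.end()] = 0.0
    blurred = np.convolve(mask, np.ones(window) / window, mode="same")
    keep = blurred > thresh
    # Collect contiguous character runs classified as main content.
    runs, start = [], None
    for i, flag in enumerate(keep):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            runs.append(html[start:i]); start = None
    if start is not None:
        runs.append(html[start:])
    return runs

html = "<div><a href='#'>nav</a></div><p>" + "Actual article text. " * 5 + "</p>"
print(main_content_regions(html))
```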

Relevance: 30.00%

Abstract:

This thesis analyses problems related to the applicability, in business environments, of Process Mining tools and techniques. The first contribution is a presentation of the state of the art of Process Mining and a characterization of companies in terms of their "process awareness". The work continues by identifying the circumstances where problems can emerge: data preparation, the actual mining, and the interpretation of results. Further problems are the configuration of parameters by non-expert users and computational complexity. We concentrate on two possible scenarios: "batch" and "on-line" Process Mining. Concerning batch Process Mining, we first investigated the data preparation problem and proposed a solution for the identification of the "case-ids" whenever this field is not explicitly indicated. After that, we concentrated on problems at mining time and propose a generalization of a well-known control-flow discovery algorithm in order to exploit non-instantaneous events. The use of interval-based recording leads to an important improvement in performance. Later on, we report our work on parameter configuration for non-expert users. We present two approaches to select the "best" parameter configuration: one is completely autonomous; the other requires human interaction to navigate a hierarchy of candidate models. Concerning data interpretation and results evaluation, we propose two metrics: a model-to-model metric and a model-to-log metric. Finally, we present an automatic approach for extending a control-flow model with social information, in order to simplify the analysis of these perspectives. The second part of this thesis deals with control-flow discovery algorithms in on-line settings. We propose a formal definition of the problem and two baseline approaches. Two actual mining algorithms are proposed: the first is an adaptation, to the control-flow discovery problem, of a frequency counting algorithm; the second constitutes a framework of models which can be used for different kinds of streams (stationary versus evolving).
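For the on-line part, a natural reading of "adaptation of a frequency counting algorithm" is counting directly-follows pairs over the event stream with an approximate counter such as Lossy Counting; the sketch below does exactly that, as an assumption about the approach rather than the thesis' exact algorithm.

```python
# Lossy Counting over directly-follows pairs from an event stream (sketch).
# Approximate counts within eps*N error; the thesis' algorithm may differ.
import math
from collections import defaultdict

class StreamingDirectlyFollows:
    def __init__(self, eps: float = 0.01):
        self.width = math.ceil(1 / eps)          # bucket width
        self.n = 0                               # directly-follows pairs seen
        self.counts = {}                         # (a, b) -> [count, delta]
        self.last = defaultdict(lambda: None)    # case id -> last activity

    def observe(self, case_id: str, activity: str) -> None:
        prev = self.last[case_id]
        self.last[case_id] = activity
        if prev is None:
            return
        self.n += 1
        bucket = math.ceil(self.n / self.width)
        pair = (prev, activity)
        if pair in self.counts:
            self.counts[pair][0] += 1
        else:
            self.counts[pair] = [1, bucket - 1]
        if self.n % self.width == 0:             # periodic pruning
            self.counts = {p: cd for p, cd in self.counts.items()
                           if cd[0] + cd[1] > bucket}

dfg = StreamingDirectlyFollows(eps=0.1)
for case, act in [("c1", "a"), ("c1", "b"), ("c2", "a"), ("c2", "b"), ("c1", "c")]:
    dfg.observe(case, act)
print(dfg.counts)   # approximate directly-follows frequencies
```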

Relevance: 30.00%

Abstract:

Autism Spectrum Disorders (ASDs) describe a set of neurodevelopmental disorders and represent a significant public health problem. Currently, ASDs are not diagnosed before the second year of life, but early identification would be crucial, as early interventions are much more effective than specific therapies started in later childhood. To this aim, cheap and contact-less automatic approaches have recently aroused great clinical interest. Among them, the cry and the movements of the newborn, both involving the central nervous system, have been proposed as possible indicators of neurological disorders. This PhD work is a first step towards solving this challenging problem. An integrated system is presented enabling the recording of audio (crying) and video (movements) data of the newborn, their automatic analysis with innovative techniques for the extraction of clinically relevant parameters, and their classification with data mining techniques. New robust algorithms were developed for the selection of the voiced parts of the cry signal, the estimation of acoustic parameters based on the wavelet transform, and the analysis of the infant's general movements (GMs) through a new body model for segmentation and 2D reconstruction. In addition to a thorough literature review, this thesis presents the state of the art on these topics, which shows that no studies exist concerning normative ranges for newborn infant cry in the first 6 months of life, nor on the correlation between cry and movements. Using the new automatic methods, a population of control infants ("low-risk", LR) was compared to a group of "high-risk" (HR) infants, i.e. siblings of children already diagnosed with ASD. A subset of LR infants, clinically diagnosed as newborns with typical development (TD), and one infant affected by ASD were compared. The results show that the selected acoustic parameters allow good differentiation between the two groups. This result provides new perspectives, both diagnostic and therapeutic.
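As an illustration of the first processing step, the selection of voiced parts of the cry signal, a classical baseline uses short-time energy and zero-crossing rate; the sketch below implements that baseline with arbitrary thresholds, whereas the thesis develops more robust algorithms.

```python
# Baseline voiced/unvoiced selection by short-time energy and zero-crossing
# rate (ZCR). Thresholds are arbitrary; the thesis' robust method differs.
import numpy as np

def voiced_frames(signal: np.ndarray, sr: int, frame_ms: float = 25.0):
    frame = int(sr * frame_ms / 1000)
    n = len(signal) // frame
    frames = signal[: n * frame].reshape(n, frame)
    energy = (frames ** 2).mean(axis=1)
    zcr = (np.diff(np.sign(frames), axis=1) != 0).mean(axis=1)
    # Voiced cry segments: relatively high energy, relatively low ZCR.
    return (energy > 0.5 * energy.mean()) & (zcr < zcr.mean())

sr = 16000
t = np.arange(sr) / sr
cry = np.sin(2 * np.pi * 450 * t)                 # crude 450 Hz "cry" tone
cry[sr // 2 :] = 0.01 * np.random.randn(sr // 2)  # second half: noise only
print(voiced_frames(cry, sr).astype(int))         # 1 = voiced frame
```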

Relevance: 30.00%

Abstract:

Granular matter, also known as bulk solids, consists of discrete particles with sizes between micrometers and meters. Such materials are present in many industrial applications as well as in daily life, for example in food processing, pharmaceutics or the oil and mining industries. When handling granular matter, the bulk solids are stored, mixed, conveyed or filtered. These techniques are based on observations in macroscopic experiments, i.e. rheological examinations of the bulk properties. Despite ample investigation of bulk mechanics, the relation between single-particle motion and macroscopic behavior is still not well understood. Exploring the microscopic properties on a single-particle level requires 3D imaging techniques.

The objective of this work was the investigation of single-particle motion in a bulk system in 3D under an external mechanical load, i.e. compression and shear. During the mechanical load the structural and dynamical properties of these systems were examined with confocal microscopy. To this end, new granular model systems in the wet and dry state were designed and prepared. As the particles are solid bodies, their motion is described by six degrees of freedom. To explore their entire motion with all degrees of freedom, a technique to visualize the rotation of spherical micrometer-sized particles in 3D was developed.

One of the foci of this dissertation was a model system for dry cohesive granular matter. In such systems the particle motion during compression of the granular matter was investigated. In general, the rotation of single particles was the more sensitive parameter compared to the translation. In regions with large structural changes the rotation had an earlier onset than the translation. In granular systems under shear, shear dilatation and shear zone formation were observed. Globally, the granular sediments showed a shear behavior already known from classical shear experiments, for example with Jenike cells. Locally, shear zone formation was enhanced when a pre-diluted region existed near the applied load. In regions with constant volume fraction, mixing between the different particle layers occurred. In particular, an exchange of particles between the currently flowing region and the non-flowing region was observed.

The second focus was on model systems for wet granular matter, where an additional binding liquid is added to the particle suspension. To examine the 3D structure of the binding liquid on the micrometer scale independently of the particles, a second illumination and detection beam path was implemented. In shear and compression experiments on wet clusters and bulk systems, completely different dynamics occurred compared to the dry cohesive model systems. In a Pickering emulsion-like system, large structural changes predominantly occurred in the local environment of binding liquid droplets. These large local structural changes were due to an energy interplay between the energy stored in the binding droplet during its deformation and the binding energy of particles at the droplet interface.

Confocal microscopy in combination with nanoindentation gave new insights into the single-particle motion and dynamics of granular systems under a mechanical load. These novel experimental results can help improve the understanding of the relationship between the bulk properties of granular matter, such as volume fraction or yield stress, and the dynamics on a single-particle level.

Relevance: 30.00%

Abstract:

Background: In protein sequence classification, identification of the sequence motifs or n-grams that can precisely discriminate between classes is a more interesting scientific question than the classification itself. A number of classification methods aim at accurate classification but fail to explain which sequence features actually contribute to the accuracy. We hypothesize that sequences in lower denominations (n-grams) can be used to explore the sequence landscape and to identify class-specific motifs that discriminate between classes during classification. Discriminative n-grams are short peptide sequences that are highly frequent in one class but are either minimally present or absent in other classes. In this study, we present a new substitution-based scoring function for identifying discriminative n-grams that are highly specific to a class.

Results: We present a scoring function based on discriminative n-grams that can effectively discriminate between classes. The scoring function initially harvests the entire set of 4- to 8-grams from the protein sequences of the different classes in the dataset. Similar n-grams of the same size are combined to form new n-grams, where similarity is defined by positive amino acid substitution scores in the BLOSUM62 matrix. Substitution results in a large increase in the number of discriminative n-grams harvested. Due to the unbalanced nature of the dataset, the frequencies of the n-grams are normalized using a dampening factor, which gives more weight to n-grams that appear in fewer classes and vice versa. After the n-grams are normalized, the scoring function identifies discriminative 4- to 8-grams for each class that are frequent enough to be above a selection threshold. By mapping these discriminative n-grams back to the protein sequences, we obtained contiguous n-grams that represent short class-specific motifs in protein sequences. Our method fared well compared with an existing motif-finding method known as Wordspy. We validated our enriched set of class-specific motifs against the functionally important motifs obtained from the NLSdb, Prosite and ELM databases. We demonstrate that the method is very generic and can thus be widely applied to detect class-specific motifs in many protein sequence classification tasks.

Conclusion: The proposed scoring function and methodology can identify class-specific motifs using discriminative n-grams derived from the protein sequences. The implementation of amino acid substitution scores for similarity detection and of the dampening factor to normalize the unbalanced datasets has a significant effect on the performance of the scoring function. Our multipronged validation tests demonstrate that this method can detect class-specific motifs from a wide variety of protein sequence classes, with a potential application to detecting proteome-specific motifs of different organisms.
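The harvesting, BLOSUM62-based merging and dampening steps described above can be sketched as follows, assuming Biopython is available for the substitution matrix; the dampening formula used here (inverse of the number of classes containing the n-gram) is an assumed stand-in, since the abstract does not give the exact expression.

```python
# Sketch: harvest n-grams per class, merge similar ones via positive BLOSUM62
# substitution scores, and down-weight n-grams occurring in many classes.
# The dampening formula here is an assumption, not the paper's exact one.
from collections import Counter, defaultdict
from Bio.Align import substitution_matrices  # Biopython >= 1.75 assumed

BLOSUM62 = substitution_matrices.load("BLOSUM62")

def ngrams(seq, n):
    return (seq[i:i + n] for i in range(len(seq) - n + 1))

def similar(g1, g2):
    """Same-length n-grams whose aligned residues all score > 0 in BLOSUM62."""
    return len(g1) == len(g2) and all(BLOSUM62[a, b] > 0 for a, b in zip(g1, g2))

classes = {"kinase": ["MKKLVAT", "MKRLVAS"], "protease": ["GDSGGPL"]}
counts = {c: Counter(g for s in seqs for n in range(4, 9) for g in ngrams(s, n))
          for c, seqs in classes.items()}

# Merge counts of similar n-grams within each class (toy O(n^2) pass).
for ctr in counts.values():
    grams = list(ctr)
    for i, g1 in enumerate(grams):
        for g2 in grams[i + 1:]:
            if g1 != g2 and similar(g1, g2):
                ctr[g1] += ctr[g2]

# Dampening: down-weight n-grams present in many classes (assumed formula).
presence = defaultdict(set)
for c, ctr in counts.items():
    for g in ctr:
        presence[g].add(c)
scores = {(c, g): f / len(presence[g])
          for c, ctr in counts.items() for g, f in ctr.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1])[:5])
```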

Relevance: 30.00%

Abstract:

In 2009, the International Commission on Radiological Protection issued a statement on radon which stated that the dose conversion factor for radon progeny would likely double, and the calculation of risk from radon should move to a dosimetric approach, rather than the longstanding epidemiological approach. Through the World Nuclear Association, whose members represent over 90% of the world's uranium production, industry has been examining this issue with a goal of offering expertise and knowledge to assist with the practical implementation of these evolutionary changes to evaluating the risk from radon progeny. Industry supports the continuing use of the most current epidemiological data as a basis for risk calculation, but believes that further examination of these results is needed to better understand the level of conservatism in the potential epidemiological-based risk models. With regard to adoption of the dosimetric approach, industry believes that further work is needed before this is a practical option. In particular, this work should include a clear demonstration of the validation of the dosimetric model which includes how smoking is handled, the establishment of a practical measurement protocol, and the collection of relevant data for modern workplaces. Industry is actively working to address the latter two items.

Relevance: 30.00%

Abstract:

Recently developed computer applications provide tools for planning cranio-maxillofacial interventions based on 3-dimensional (3D) virtual models of the patient's skull obtained from computed tomography (CT) scans. Precise knowledge of the location of the mid-facial plane is important for the assessment of deformities and for planning reconstructive procedures. In this work, a new method is presented to automatically compute the mid-facial plane on the basis of a surface model of the facial skeleton obtained from CT. The method matches homologous surface areas selected by the user on the left and right facial sides using an iterative closest point optimization. The symmetry plane which best approximates this matching transformation is then computed. This new automatic method was evaluated in an experimental study that included experienced and inexperienced clinicians defining the symmetry plane by a selection of landmarks. This manual definition was systematically compared with the definition resulting from the new automatic method: the quality of the symmetry planes was evaluated by their ability to match homologous areas of the face. The results show that the new automatic method is reliable and leads to significantly higher accuracy than the manual method when the latter is performed by inexperienced clinicians. In addition, the method performs equally well in difficult trauma situations, where key landmarks are unreliable or absent.
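Assuming the iterative closest point step has already produced corresponding left/right point pairs, a least-squares estimate of the symmetry plane can be sketched as below; the paper derives the plane from the matching transformation itself, so this is a simplified stand-in.

```python
# Estimate a mid-facial symmetry plane from matched left/right point pairs.
# Simplified stand-in: the paper extracts the plane from the ICP transform.
import numpy as np

def symmetry_plane(left: np.ndarray, right: np.ndarray):
    """left, right: (N, 3) arrays of corresponding mirror points.
    Returns (normal, point): the plane through `point` with normal `normal`."""
    diffs = left - right                          # mirror pairs straddle plane
    diffs /= np.linalg.norm(diffs, axis=1, keepdims=True)
    # Dominant direction of the normalized differences ~ plane normal.
    _, _, vt = np.linalg.svd(diffs)
    normal = vt[0]
    point = (left + right).mean(axis=0) / 2       # centroid of pair midpoints
    return normal, point

left = np.array([[10.0, 0.0, 0.0], [12.0, 5.0, 3.0], [9.0, -4.0, 6.0]])
right = left * np.array([-1.0, 1.0, 1.0])         # perfect mirror for the demo
n, p = symmetry_plane(left, right)
print(n, p)   # normal ~[+-1, 0, 0]; point lies on the x = 0 plane
```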

Relevance: 30.00%

Abstract:

Software repositories have been receiving a lot of attention from researchers in recent years. In order to analyze software repositories, it is necessary first to extract raw data from the version control and problem tracking systems. This poses two challenges: (1) extraction requires a non-trivial effort, and (2) the results depend on the heuristics used during extraction. These challenges burden researchers who are new to the community and make it difficult to benchmark software repository mining, since it is almost impossible to reproduce experiments done by another team. In this paper we present the TA-RE corpus. TA-RE collects extracted data from software repositories in order to build a collection of projects that will simplify the extraction process. Additionally, the collection can be used for benchmarking. As a first step we propose an exchange language capable of making sharing and reusing data as simple as possible.
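The abstract does not show the exchange language itself; purely as a hypothetical illustration of what a shared, extraction-aware record might contain, the sketch below serializes one commit with Python's json module. All field names are invented and are not the actual TA-RE format.

```python
# Hypothetical record for sharing extracted repository data; field names are
# invented for illustration and are NOT the actual TA-RE exchange language.
import json

record = {
    "project": "example-project",
    "source": {"vcs": "cvs", "tracker": "bugzilla"},
    "commit": {
        "id": "1.42",
        "author": "alice",
        "timestamp": "2006-05-04T12:00:00Z",
        "files": ["src/Parser.java"],
        "linked_bugs": [1234],                  # heuristic link: provenance
        "extraction_heuristic": "message-regex-v1",  # matters for replication
    },
}
print(json.dumps(record, indent=2))
```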

Relevance: 30.00%

Abstract:

Automatic identification and extraction of bone contours from X-ray images is an essential first step for further medical image analysis. In this paper we propose a 3D statistical model based framework for proximal femur contour extraction from calibrated X-ray images. Automatic initialization is solved by an estimation of Bayesian network algorithm that fits a multiple-component geometrical model to the X-ray data. The contour extraction is accomplished by a non-rigid 2D/3D registration between a 3D statistical model and the X-ray images, in which bone contours are extracted by graphical-model-based Bayesian inference. Preliminary experiments on clinical data sets verified its validity.

Relevance: 30.00%

Abstract:

Aggregates were historically a low-cost commodity, but with communities and governmental agencies reducing the amount of mining, the cost is increasing dramatically. Communities need to be made aware that aggregate production is necessary for sustaining today's infrastructure. This can be accomplished by applying technologies proven in other areas to show that reclamation is feasible. A proposed mine reclamation, the Douglas Township quarry (DTQ) in Dakota Township, MN, was evaluated using the Visual Hydrologic Evaluation of Landfill Performance (HELP) model. HELP is commonly employed for estimating the water budget of a landfill; here, however, it was applied to determine the water budget of the DTQ following mining. Using an environmental impact statement as the case study, modeling predictions indicated the DTQ will adequately drain the water entering the system. The groundwater table will rise slightly due to the mining excavations, but no ponding will occur. The application of the HELP model determined the water budget of the DTQ and can serve as a viable option for mining companies to demonstrate how land can be reclaimed following mining operations.
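A water budget of the kind discussed above balances inflows against outflows and storage change; the toy balance below illustrates only that bookkeeping, with invented values, and is not the HELP model.

```python
# Toy annual water budget for a reclaimed quarry (all values invented, mm/yr).
# Illustrates the bookkeeping only; the HELP model is far more detailed.
precipitation = 700.0
runoff_in = 120.0            # surface inflow from the surrounding catchment
evapotranspiration = 450.0
lateral_drainage = 300.0     # water drained out of the pit

storage_change = (precipitation + runoff_in) - (evapotranspiration + lateral_drainage)
print(f"storage change: {storage_change:+.0f} mm/yr")
# A small positive value means the water table rises slightly, consistent
# with the modeling result reported above (slight rise, no ponding).
```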