977 results for "Label-free techniques"
Abstract:
Forest cover of the Maringá municipality, located in northern Paraná State, Brazil, was mapped in this study. Mapping was carried out using high-resolution HRC and medium-resolution CCD sensor imagery from the CBERS satellite. Images were georeferenced, and forest vegetation patches (TOFs, trees outside forests) were classified using two digital classification methods: one based on the reflectance (digital number) of each pixel and one object-oriented. The area of each polygon was calculated, allowing the polygons to be segregated into size classes. Thematic maps were built from the resulting polygon size classes, and summary statistics were generated for each size class in each area. Most forest fragments in Maringá were found to be smaller than 500 m². There was also a difference of 58.44% in the amount of vegetation detected between the high-resolution and medium-resolution imagery, owing to the distinct spatial resolutions of the sensors. It was concluded that high-resolution geotechnology is essential to provide reliable information on urban green areas and forest cover in highly human-perturbed landscapes.
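The size-class segregation and per-class summary statistics described above can be sketched in a few lines. This is a minimal illustration only: the polygon areas and the class breaks below are invented placeholders, since the abstract does not list the paper's actual class boundaries.

```python
# Hypothetical sketch: segregating classified forest polygons into size
# classes and summarizing each class. Areas (m²) are invented values.
polygon_areas = [120.0, 340.5, 780.0, 1500.0, 95.2, 430.1, 2600.0]

# Hypothetical size classes (m²); the paper's actual breaks are not given.
size_classes = [(0, 500), (500, 1000), (1000, float("inf"))]

def summarize(areas, classes):
    """Return per-class patch count and total area."""
    stats = {}
    for lo, hi in classes:
        members = [a for a in areas if lo <= a < hi]
        stats[(lo, hi)] = {"count": len(members), "total_area": sum(members)}
    return stats

stats = summarize(polygon_areas, size_classes)
for (lo, hi), s in stats.items():
    print(f"{lo}-{hi} m²: {s['count']} patches, {s['total_area']:.1f} m² total")
```

The same binning logic applies whichever classification method (pixel-based or object-oriented) produced the polygons.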
Abstract:
Pectus excavatum is the most common deformity of the thorax, and its pre-operative diagnosis usually involves a Computed Tomography (CT) examination. Aiming to eliminate the high radiation exposure of CT, this work presents a new methodology for replacing CT with a (radiation-free) laser scanner in the treatment of pectus excavatum using personally modeled prostheses. The complete elimination of CT requires determining the external outline of the ribs, at the point of maximum sternum depression, for prosthesis placement, based on chest-wall skin surface information acquired by a laser scanner. The developed solution resorts to artificial neural networks trained with data vectors from 165 patients. The Scaled Conjugate Gradient, Levenberg-Marquardt, Resilient Backpropagation, and One Step Secant learning algorithms were used. Training used soft-tissue thicknesses determined by image processing techniques that automatically segment the skin and rib cage. The developed solution was then used to determine the rib outline in scanner data from 20 patients. Tests revealed that rib position can be estimated with an average error of about 6.82±5.7 mm for the left and right sides of the patient. This error range is well below that of current manual prosthesis modeling (11.7±4.01 mm), even without CT imaging, indicating a considerable step towards replacing CT with a 3D scanner for prosthesis personalization.
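The surface-to-rib regression idea can be illustrated with a toy feed-forward network. Note the hedges: the paper trained with Scaled Conjugate Gradient, Levenberg-Marquardt, Resilient Backpropagation, and One Step Secant; plain gradient descent stands in here, and all data below are synthetic, not patient measurements.

```python
import numpy as np

# Toy sketch (NOT the paper's method): a small feed-forward network mapping
# chest-surface features to a rib-depth target, trained by plain gradient
# descent on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(165, 4))             # 165 samples, 4 surface features
true_w = np.array([0.5, -0.2, 0.1, 0.3])
y = X @ true_w                            # synthetic target (rib depth offset)

W1 = rng.normal(scale=0.1, size=(4, 8))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(8, 1))   # hidden -> output weights

pred0 = (np.tanh(X @ W1) @ W2).ravel()
mse0 = float(np.mean((pred0 - y) ** 2))   # loss before training

lr = 0.05
for _ in range(500):
    h = np.tanh(X @ W1)                   # hidden activations
    pred = (h @ W2).ravel()
    err = pred - y
    # Backpropagate the mean-squared-error gradient.
    gW2 = h.T @ err[:, None] / len(y)
    gh = err[:, None] @ W2.T * (1 - h ** 2)
    gW1 = X.T @ gh / len(y)
    W2 -= lr * gW2
    W1 -= lr * gW1

mse = float(np.mean(((np.tanh(X @ W1) @ W2).ravel() - y) ** 2))
```

After training, the loss should fall below its initial value; the algorithms named in the abstract are faster, better-conditioned variants of this same backpropagation loop.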
Abstract:
Background: Several studies link the seamless fit of implant-supported prostheses to the accuracy of the dental impression technique used during acquisition. In addition, factors such as implant angulation and coping shape contribute to implant misfit. Purpose: The aim of this study was to identify the most accurate impression technique and the factors affecting impression accuracy. Material and Methods: A systematic review of the peer-reviewed literature was conducted, analyzing articles published between 2009 and 2013. The following search terms were used: implant impression, impression accuracy, and implant misfit. A total of 417 articles were identified; 32 were selected for review. Results: All 32 selected studies were in vitro studies. Fourteen articles compared the open and closed impression techniques: 8 advocated the open technique and 6 reported similar results. Another 14 articles evaluated splinted and non-splinted techniques, all advocating the splinted technique. Polyether impression material was used in nine studies, vinyl polysiloxane was tested in six, and irreversible hydrocolloid in one. Eight studies evaluated different coping designs. Intraoral optical devices were compared in four studies. Conclusions: The most accurate results were achieved with two configurations: (1) an optical intraoral system with powder and (2) the open technique with splinted squared transfer copings, using polyether as the impression material.
Abstract:
Micronuclei (MN) in exfoliated epithelial cells are widely used as biomarkers of cancer risk in humans. MN are classified as biomarkers of chromosome breakage and loss. They are small, extranuclear bodies that arise in dividing cells from acentric chromosome/chromatid fragments or from whole chromosomes/chromatids that lag behind in anaphase and are not included in the daughter nuclei at telophase. Buccal mucosa cells have been used for biomonitoring exposed populations because these cells lie in the direct route of exposure to ingested pollutants, are capable of metabolizing proximate carcinogens into reactive chemicals, and are easily and rapidly collected by brushing the buccal mucosa. The objective of the present study was to further investigate whether, and to what extent, different stains affect the results of micronucleus studies in exfoliated cells. The techniques compared were: Papanicolaou (PAP), Modified Papanicolaou, May-Grünwald Giemsa (MGG), Giemsa, Harris's Hematoxylin, Feulgen with Fast Green counterstain, and Feulgen without counterstain.
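For readers unfamiliar with how such studies are quantified, micronucleus frequency is conventionally reported per 1000 cells scored. The counts below are invented for illustration and do not come from this study.

```python
# Illustrative calculation with invented counts (not this study's data):
# micronucleus frequency per 1000 exfoliated cells scored.
cells_scored = 2000          # hypothetical number of buccal cells scored
micronucleated_cells = 9     # hypothetical cells bearing >= 1 micronucleus

mn_per_1000 = 1000 * micronucleated_cells / cells_scored
print(f"MN frequency: {mn_per_1000:.1f} per 1000 cells")
```

A stain that obscures or mimics micronuclei would shift this frequency, which is exactly the confound the study above investigates.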
Abstract:
Storm and tsunami deposits are generated by similar depositional mechanisms, making them hard to discriminate using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigated the tsunami deposit of the Boca do Rio estuary generated by the 1755 Lisbon earthquake, which is well described in the literature. We first tested the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. We then used concentration-sensitive (MS, SIRM) and grain-size-sensitive (chi(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies, coupled with SEM microscopy, to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. To study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, the magnetic data were processed by multivariate statistical analyses. Our numerical simulations show a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of about 7 m. The magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate a closer connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario in which the energy released by the tsunami wave was strong enough to overtop the littoral dune, erode a substantial amount of sand from it, and mix that sand with material reworked from underlying layers down to at least 1 m in depth. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
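The multivariate treatment of the magnetic proxies can be illustrated with one common technique, principal component analysis via SVD. The abstract does not specify which multivariate analyses were used, and the data matrix below is invented, so this is a sketch of the general workflow only.

```python
import numpy as np

# Hedged sketch: PCA (via SVD) applied to a toy matrix of magnetic proxy
# values. Rows = sediment samples, columns = proxies such as MS, SIRM,
# chi(ARM), ARM/SIRM. Values are invented, not the paper's data.
rng = np.random.default_rng(1)
data = rng.normal(size=(12, 4))          # 12 samples x 4 magnetic proxies

centered = data - data.mean(axis=0)      # centre each proxy column
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt.T                 # principal-component scores
explained = S ** 2 / np.sum(S ** 2)      # variance fraction per component
```

Samples whose scores cluster together (e.g. the tsunami layer with Units C and D) are interpreted as sharing a sediment source, which is the kind of connection the analyses above establish.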
Abstract:
We characterize the elastic contribution to the surface free energy of a nematic liquid crystal in the presence of a sawtooth substrate. Our findings are based on numerical minimization of the Landau-de Gennes model and on analytical calculations within the Frank-Oseen theory. The nucleation of disclination lines (characterized by non-half-integer winding numbers) at the wedges and apexes of the substrate induces a leading-order term proportional to q ln q in the elastic contribution to the surface free-energy density, where q is the wave number associated with the substrate periodicity.
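To fix notation, the analytical side of the calculation can be sketched in the one-constant approximation of the Frank-Oseen theory. The prefactor A below is an assumption standing in for a constant that would depend on the elastic constant and the wedge geometry; the abstract states only the q ln q scaling.

```latex
% One-constant Frank-Oseen elastic free energy for a director angle \theta:
F_e = \frac{K}{2} \int |\nabla \theta|^2 \, dV
% Claimed leading behaviour of the elastic surface free-energy density,
% with q the wave number of the substrate periodicity (A: geometry- and
% K-dependent constant, assumed here for illustration):
f_s(q) \sim A\, q \ln q + \mathcal{O}(q)
```

The logarithm is characteristic of line defects: each disclination contributes an elastic energy growing logarithmically with the system-size-to-core-size ratio, and the density of wedges and apexes scales with q.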
Abstract:
The automatic organization of e-mail messages is a current challenge in machine learning. The excessive number of messages affects ever more users, especially those who use e-mail as a communication and work tool. This thesis addresses the problem of automatically organizing e-mail messages, proposing a solution aimed at the automatic labeling of messages. Automatic labeling is performed by using the e-mail folders previously created by users, treating them as labels, and by suggesting multiple labels for each message (top-N). Several learning techniques are studied, and the various fields that make up an e-mail message are analyzed to determine their suitability as classification features. The focus of this work is on the textual fields (the subject and body of messages), for which different forms of representation, feature selection, and classification algorithms are studied. The participant fields are also evaluated through classification algorithms that represent them using the vector-space model or as a graph. The various fields are combined for classification using the Majority Voting classifier-combination technique. Tests are carried out with a subset of Enron e-mail messages and with a private dataset provided by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). These datasets are analyzed to understand the characteristics of the data. The system is evaluated by the classifiers' accuracy. The results obtained show significant improvements compared with related work.
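The Majority Voting combination scheme mentioned above is simple enough to sketch directly. The per-field classifier outputs below are invented placeholders; the thesis's actual classifiers and label set are not given in the abstract.

```python
from collections import Counter

# Hedged sketch of Majority Voting: each field-specific classifier
# (subject, body, participants, ...) proposes a label, and the combined
# prediction is the most-voted label.
def majority_vote(predictions):
    """predictions: list of labels, one per classifier."""
    counts = Counter(predictions)
    return counts.most_common(1)[0][0]

# Invented example outputs from three field classifiers:
votes = ["projects", "meetings", "projects"]
print(majority_vote(votes))   # -> "projects"
```

A top-N variant would return the N most-voted labels (`counts.most_common(N)`) instead of only the winner, matching the multi-label suggestion described in the abstract.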
Abstract:
Urea cycle disorders are a group of inherited metabolic diseases characterized fundamentally by an accumulation of ammonia. Clinically, the spectrum is very broad, ranging from neonatal-onset forms to more moderate, late-onset presentations in adults. Treatment is fundamentally nutritional and leads to a significant reduction in mortality and morbidity. With the introduction of mass spectrometry in newborn screening laboratories in the mid-1990s, it became possible to quantify some intermediates of the cycle, which, together with the existence of a symptom-free interval and an effective treatment, allowed the screening of some of the diseases in this group. Screening for urea cycle disorders began in Portugal in 2004; to date, 988,687 newborns have been screened and 19 positive cases identified. Recent technical developments have made it possible to quantify new markers, notably orotic acid, opening the possibility of screening for ornithine transcarbamylase deficiency, the most frequent urea cycle disorder. The authors present the current status of screening for urea cycle disorders and its prospects in light of the new technical developments.
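At its core, marker-based newborn screening compares a quantified metabolite against a cutoff. The marker units and cutoff value below are invented placeholders, not the Portuguese programme's actual criteria, which typically also involve marker ratios and confirmatory testing.

```python
# Illustrative only: flag a newborn screening sample when a quantified
# marker (e.g. a urea cycle intermediate) exceeds a cutoff. The cutoff
# value is an invented placeholder, not a real clinical threshold.
def flag_sample(marker_value_umol_l, cutoff_umol_l):
    """Return True when the marker exceeds the screening cutoff."""
    return marker_value_umol_l > cutoff_umol_l

print(flag_sample(4.2, 3.0))   # True -> refer for confirmatory testing
```

Adding a new marker such as orotic acid, as discussed above, amounts to adding a further cutoff comparison of this kind to the screening panel.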
Abstract:
Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC in which side information is made available at the decoder, enables a flexible distribution of computational complexity between encoder and decoder, promising to fulfill novel requirements of applications such as video surveillance, sensor networks, and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion-compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher-quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated into a transform-domain, turbo-coding-based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements of up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB with lower encoding complexity.
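To make "side information" concrete: the simplest estimator interpolates the WZ frame from the two decoded key frames that bracket it. The paper's method is far richer (motion estimation plus motion-field regularization); the pixel-wise average below is only the baseline that motion-compensated interpolation improves on, on invented toy frames.

```python
import numpy as np

# Baseline sketch (NOT the paper's estimator): bidirectional frame
# interpolation by pixel-wise averaging of the two decoded key frames
# bracketing the Wyner-Ziv frame.
def average_interpolation(prev_frame, next_frame):
    """Estimate the in-between frame as the pixel-wise mean."""
    return (prev_frame.astype(np.float64) + next_frame.astype(np.float64)) / 2

# Invented 4x4 toy key frames:
prev_f = np.full((4, 4), 10, dtype=np.uint8)
next_f = np.full((4, 4), 30, dtype=np.uint8)
side_info = average_interpolation(prev_f, next_f)   # every pixel -> 20.0
```

The closer the side information is to the true WZ frame, the fewer parity bits the turbo decoder needs to request, which is why better interpolation translates directly into RD gains.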
Abstract:
Recent literature has shown that many classical pricing models (Black and Scholes, Heston, etc.) and risk measures (VaR, CVaR, etc.) may lead to "pathological, meaningless situations", since traders can build sequences of portfolios whose risk level tends to −∞ and whose expected return tends to +∞, i.e., (risk = −∞, return = +∞). Such a sequence of strategies may be called a "good deal". This paper focuses on the risk measures VaR and CVaR and analyzes this caveat in a discrete-time complete pricing model. Under quite general conditions, the explicit expression of a good deal is given, and its sensitivity with respect to possible measurement errors is provided as well. We point out that a critical property is the absence of short sales. In that case we first construct a "shadow riskless asset" (SRA) without short sales; the good deal is then given by borrowing more and more money to invest in the SRA. It is also shown that the SRA is of interest in itself, even when there are short-selling restrictions.
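For readers who want the two risk measures pinned down, an empirical version can be computed in a few lines. Conventions vary across the literature; the sketch below treats losses as positive numbers, takes VaR at level alpha as the empirical alpha-quantile of losses, and CVaR as the mean loss at or beyond it. The loss sample is invented.

```python
import numpy as np

# Hedged sketch: empirical VaR and CVaR of a loss sample at level alpha
# (losses positive; one common convention among several).
def var_cvar(losses, alpha=0.95):
    losses = np.sort(np.asarray(losses, dtype=float))
    var = np.quantile(losses, alpha)       # alpha-quantile of losses
    cvar = losses[losses >= var].mean()    # mean loss in the tail
    return var, cvar

# Invented loss sample (negative = gain):
losses = [-2.0, -1.0, 0.5, 1.0, 3.0, 5.0, 8.0, 12.0]
v, c = var_cvar(losses, alpha=0.75)
```

By construction CVaR is at least VaR. The "good deal" pathology above arises because leveraging a suitable portfolio can push such empirical risk measures down without bound while the expected return grows, which is what the SRA construction exploits.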