14 results for artifacts
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
The aim of this study was to determine whether image artifacts caused by orthodontic metal accessories interfere with the accuracy of 3D CBCT model superimposition. A human dry skull was subjected to a CBCT scan three times: first without orthodontic brackets (T1), then with stainless steel brackets bonded without (T2) and with (T3) orthodontic arch wires inserted into the brackets' slots. The registration of image surfaces and the superimposition of 3D models were performed. Within-subject surface distances between T1-T2, T1-T3 and T2-T3 were computed for comparison among the three data sets. The minimum and maximum Hausdorff Distance units (HDu) computed between the corresponding data points of the T1 and T2 CBCT 3D surface images were 0.000000 and 0.049280 HDu, respectively, and the mean distance was 0.002497 HDu. The minimum and maximum Hausdorff Distances between T1 and T3 were 0.000000 and 0.047440 HDu, respectively, with a mean distance of 0.002585 HDu. In the comparison between T2 and T3, the minimum, maximum and mean Hausdorff Distances were 0.000000, 0.025616 and 0.000347 HDu, respectively. In the current study, the image artifacts caused by metal orthodontic accessories did not compromise the accuracy of the 3D model superimposition. Color-coded maps of overlaid structures complemented the computed Hausdorff Distances and demonstrated a precise fusion between the data sets.
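The Hausdorff and mean surface distances reported above can be sketched in a few lines of pure Python. This is an illustrative version over small 3D point sets, not the meshing pipeline used in the study; a real CBCT surface comparison would run over dense meshes with dedicated tooling.

```python
import math

def directed_hausdorff(a, b):
    """Largest distance from any point in set a to its nearest neighbour in b."""
    return max(min(math.dist(p, q) for q in b) for p in a)

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

def mean_surface_distance(a, b):
    """Mean nearest-neighbour distance from a to b (the reported mean HDu)."""
    return sum(min(math.dist(p, q) for q in b) for p in a) / len(a)

# Two toy "surfaces": identical except one point displaced by 0.1 along z
t1 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
t2 = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.0)]
```

With these toy sets, `hausdorff(t1, t2)` is 0.1 (the largest mismatch) while `mean_surface_distance(t1, t2)` is 0.05, mirroring how the study reports both a maximum and a mean distance per comparison.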
Abstract:
This work assessed the homogeneity of the Institute of Astronomy, Geophysics and Atmospheric Sciences (IAG) weather station climate series using various statistical techniques. The record from this target station is one of the longest in Brazil, having commenced in 1933 with observations of precipitation, with temperatures and other variables added in 1936. It is thus one of the few stations in Brazil with enough data for long-term climate variability and climate change studies. There is, however, a possibility that its data have been contaminated by artifacts over time. Admittedly, there was an intervention on the observations in 1958, with the replacement of instruments, whose impact has not yet been evaluated. The station's surroundings also changed over time from rural to urban, which may have affected the homogeneity of the observations and makes the station less representative for climate studies over larger spatial scales. Homogeneity of the target station was assessed by applying both absolute (single-station) tests and tests relative to the regional climate, on an annual scale, to daily precipitation, relative humidity, maximum (TMax), minimum (TMin), and wet bulb temperatures. Among these quantities, only precipitation does not exhibit any inhomogeneity. A clear signal of the 1958 change of instruments was detected in the TMax and relative humidity data, the latter certainly because of its strong dependence on temperature. This signal is not as clear in TMin, which instead presents non-climatic discontinuities around 1953 and around 1970. A significant homogeneity break is found around 1990 for TMax and wet bulb temperature. The discontinuities detected after 1958 may have been caused by urbanization, as the observed warming trend at the station is considerably greater than that of the regional climate.
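The abstract does not name the absolute homogeneity tests applied; one standard single-station choice for detecting one break in an annual series is the Pettitt test. The sketch below is an assumption for illustration, not the authors' exact method.

```python
import math

def pettitt_test(x):
    """Pettitt single change-point test on a series x.

    Returns (index of the last point before the most likely break,
    approximate two-sided p-value).
    """
    n = len(x)
    sign = lambda v: (v > 0) - (v < 0)
    # U_t accumulates sign(x_j - x_i) over all pairs straddling a break after t
    U = [sum(sign(x[j] - x[i]) for i in range(t + 1) for j in range(t + 1, n))
         for t in range(n - 1)]
    t_break = max(range(n - 1), key=lambda t: abs(U[t]))
    K = abs(U[t_break])
    # Asymptotic approximation of the significance of the maximum statistic
    p = min(1.0, 2.0 * math.exp(-6.0 * K * K / (n ** 3 + n ** 2)))
    return t_break, p
```

Applied to an annual TMax series, a small p-value around 1958 or 1990 would flag the kinds of non-climatic breaks the study reports.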
Abstract:
Distributed Software Development (DSD) is a development strategy that meets globalization's demands for increased productivity and reduced cost. However, temporal distance, geographical dispersion and socio-cultural differences introduce challenges and, especially, add new requirements related to the communication, coordination and control of projects. Among these new demands is the need for a software process that adequately supports distributed software development. This paper presents an integrated approach to software development and testing that considers the peculiarities of distributed teams. The purpose of the approach is to support DSD by providing better project visibility, improving communication between the development and test teams, and reducing the ambiguity and difficulty of understanding artifacts and activities. This integrated approach was conceived on four pillars: (i) identifying the DSD peculiarities that concern development and test processes; (ii) defining the elements needed to compose an integrated approach to development and testing that supports distributed teams; (iii) describing and specifying the workflows, artifacts, and roles of the approach; and (iv) representing the approach appropriately to enable its effective communication and understanding.
Abstract:
Although the occurrence of glandular trichomes is frequently reported for aerial vegetative organs, many questions remain open about the presence of such trichomes in underground systems. Here we present, for the first time, a comparative study of the structure, ultrastructure and chemical aspects of both the aerial and underground glandular trichomes of two Chrysolaena species, C. obovata and C. platensis. Glandular trichomes (GTs) were examined using LM, SEM, and TEM, and also analyzed by GC-MS and by HPLC coupled to UV/DAD and HR-ESI-MS (HPLC-UV-MS). In both aerial (leaf and bud) and underground (rhizophore) organs, the GTs are multicellular, biseriate and formed by five pairs of cells: a pair of support cells, a pair of basal cells, and three pairs of secreting cells. At the beginning of the secretory process, these secreting cells contain abundant smooth ER. The same classes of secondary metabolites are biosynthesized and stored in both aerial and underground GTs of C. platensis and C. obovata. The GTs from aerial and underground organs have similar cellular and sub-cellular anatomy; however, the belowground trichomes show a higher diversity of compounds compared with those from the leaves. We also demonstrate by means of HPLC-UV-DAD that the sesquiterpene lactones are located inside the trichomes and that hirsutinolides are not artifacts. (C) 2012 Elsevier GmbH. All rights reserved.
Abstract:
Introduction: The objective of the study was to evaluate the ability of large-volume cone-beam computed tomography (CBCT) to detect horizontal root fracture and to test the influence of a metallic post. Methods: Through the examination of 40 teeth by large-volume CBCT (20-cm height and 15-cm diameter cylinder) at 0.2-mm voxel resolution, 2 observers analyzed the samples for the presence and localization of horizontal root fracture. Results: The values of accuracy in the groups that had no metallic post ranged from 33%-68%, whereas for the samples with the metallic post, values showed a wide variation (38%-83%). Intraobserver agreement showed no statistically significant difference between the groups with/without metallic post; both ranged from very weak to weak (kappa, 0.09-0.369). Conclusions: The low accuracy and low intraobserver and interobserver agreement reflect the difficulty in performing an adequate diagnosis of horizontal root fractures through a large-volume CBCT by using a small voxel reconstruction. (J Endod 2012;38:856-859)
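The intra- and interobserver agreement above is reported as kappa; Cohen's kappa for two raters can be computed directly from the two label sequences. A minimal sketch (observed agreement corrected for chance agreement):

```python
def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters' labels over the same n samples."""
    n = len(r1)
    labels = set(r1) | set(r2)
    # Observed agreement: fraction of samples where the raters coincide
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement: product of each rater's marginal label frequencies
    pe = sum((r1.count(l) / n) * (r2.count(l) / n) for l in labels)
    return (po - pe) / (1 - pe)
```

With fracture coded as 1 and no fracture as 0, values in the 0.09-0.369 range reported in the study correspond to agreement only slightly above chance.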
Abstract:
We report a morphology-based approach for the automatic identification of outlier neurons, as well as its application to the NeuroMorpho.org database, with more than 5,000 neurons. Each neuron in a given analysis is represented by a feature vector composed of 20 measurements, which are then projected into a two-dimensional space by applying principal component analysis. Bivariate kernel density estimation is then used to obtain the probability distribution for the group of cells, so that the cells with highest probabilities are understood as archetypes while those with the smallest probabilities are classified as outliers. The potential of the methodology is illustrated in several cases involving uniform cell types as well as cell types for specific animal species. The results provide insights regarding the distribution of cells, yielding single and multi-variate clusters, and they suggest that outlier cells tend to be more planar and tortuous. The proposed methodology can be used in several situations involving one or more categories of cells, as well as for detection of new categories and possible artifacts.
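The density-based outlier step can be illustrated with a plain bivariate Gaussian kernel density estimate over already-projected 2-D points (the PCA projection of the 20 morphological measurements is assumed to have been done beforehand, and the fixed bandwidth below is illustrative, not the paper's estimator):

```python
import math

def kde_density(points, query, bandwidth):
    """Gaussian kernel density estimate of 2-D points, evaluated at query."""
    qx, qy = query
    h2 = bandwidth ** 2
    norm = 1.0 / (2.0 * math.pi * h2 * len(points))
    return norm * sum(
        math.exp(-((qx - px) ** 2 + (qy - py) ** 2) / (2.0 * h2))
        for px, py in points)

def lowest_density_cells(points, k, bandwidth=1.0):
    """The k points with the lowest estimated density: outlier candidates.

    Points with the highest density would be the archetypes."""
    return sorted(points, key=lambda p: kde_density(points, p, bandwidth))[:k]
```

For example, in a set of four clustered cells plus one isolated cell, the isolated cell receives the lowest density and is flagged first.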
Abstract:
Cone beam computed tomography (CBCT) can be considered a valuable imaging modality for improving diagnosis and treatment planning and for achieving true guidance in several craniofacial surgical interventions. An emerging topic in medical informatics is the discussion of new interactive imaging workflows. The aim of this article was to present, in a short literature review, the usefulness of CBCT technology as an important alternative imaging modality, highlighting current practices and near-term future applications in craniofacial surgical assessment. The article reviews the state of the art of CBCT improvements and medical workstations, and the perspectives for dedicated hardware and software that can be used with CBCT data. In conclusion, CBCT technology is developing rapidly, and many advances are on the horizon. Further progress in medical workstations, engineering capabilities, and independent software (some open source) should be pursued with this new imaging method. The perspectives, challenges, and pitfalls of CBCT are delineated and evaluated alongside the technological developments.
Abstract:
A specific separated-local-field NMR experiment, dubbed Dipolar-Chemical-Shift Correlation (DIPSHIFT), is frequently used to study molecular motions by probing reorientations through changes in the XH dipolar coupling and T-2. In systems where the coupling is weak or the reorientation angle is small, a recoupled variant of the DIPSHIFT experiment is applied, in which the effective dipolar coupling is amplified by a REDOR-like pi-pulse train. However, a previously described constant-time variant of this experiment is not sensitive to the motion-induced T-2 effect, which precludes the observation of motions over a large range of rates, from hundreds of Hz to around a MHz. We present a DIPSHIFT implementation which amplifies the dipolar couplings and is still sensitive to T-2 effects. Spin dynamics simulations, analytical calculations and experiments demonstrate the sensitivity of the technique to molecular motions and suggest the best experimental conditions to avoid imperfections. Furthermore, an in-depth theoretical analysis of the interplay of REDOR-like recoupling and proton decoupling, based on Average-Hamiltonian Theory, made it possible to explain the origin of many artifacts found in literature data. (C) 2012 Elsevier Inc. All rights reserved.
Abstract:
This study explores educational technology and management education by analyzing fidelity in game-based management education interventions. A sample of 31 MBA students was selected to help answer the research question: to what extent do MBA students tend to recognize specific game-based academic experiences, in terms of fidelity, as relevant to their managerial performance? Two distinct game-based interventions (BG1 and BG2) with key differences in fidelity levels were explored: BG1 presented higher physical and functional fidelity levels and lower psychological fidelity levels. Hypotheses were tested with data collected from the participants shortly after their experiences, concerning the overall perceived quality of the game-based interventions. The findings reveal a higher overall perception of quality for BG1: (a) better for testing strategies, (b) offering better business and market models, (c) based on a pace that better stimulates learning, and (d) presenting a fidelity level that better supports real-world performance. This study supports the conclusion that MBA students tend to recognize, to a large extent, that specific game-based academic experiences are relevant and meaningful to their managerial development, especially when the adopted artifacts have heightened fidelity levels. Agents must be ready and motivated to explore the new, to proceed by trial and error, and to learn collaboratively in order to perform.
Abstract:
Consistent with the question in its title, Reyner Banham's widely read book The New Brutalism: Ethic or Aesthetic? cannot explain Brutalism as a cohesive artistic manifestation endowed with formal consistency and reproducibility. Its citation of famous but divergent architects seems to associate and compare rough-hewn works with the pretension of reviving a modern architecture seen as ascetic, monotonous and insufficient, and it generates a theoreticism of raw appearance and of the moralism of objects. This discourse conceals, or diverts attention from, the artistic return to sublimity and artisanal construction. Brutalism is accepted as a natural evolution of the preceding modern stages and sanctions rough, heavy, unfinished artifacts as if they belonged to the purified modern process. It hides contradictions and disguises its break with the modern in order to prolong the expression "Modern Movement". Yet the clear, economical and precise object is repudiated by the consumer and, being poorly representative, the artist dresses it up with contrasting and monumental episodes amid the informality of spontaneous cities. It nevertheless seems possible to suspend the positive, corrective notion of Brutalism and to understand it as a vulgarizing artistic retreat that disdains refinement and confronts the modern attitude with conceptual banalization, exaggeration, figurality, structural muscle-flexing, tectonic grandeur, rudiment and rudeness. Thus moralism, the rustic return and originality disqualify the expression "International Style", understood as the culmination of postwar modern architecture, by disparaging it as decadent, as a real-estate, commercial and corporate product in the service of capital. This interpretation reveals a critique that is anti-industrial, hence antimodernist and distinct from postmodernity, yet contestatory and realist, supplying images to the culture and to those insensitive to the structure of modern form. It is distinct from postmodernity in its dependence on the modern and its lack of popular appeal.
Once the timely configuration of the artifact is rendered insignificant, the architect tries to retain his artistic notability, or the prestige that seems to weaken amid the look-alike appearance of catalogue specifications and modular rigor. He resents and repudiates the components, standards and impersonal finishes of the construction industry in order to insist on authorship and inspiration, yet repeats the stylistic tics of the period and the inexplicable intensive use of raw, exposed concrete so as to feel engaged and up to date. It is necessary, however, to distinguish works of severe appearance conceived from the most authentic modern attitude from those in exposed concrete of aberrant types or configurations. To advance the discussion of Brutalism, this phenomenon is understood here through the replacement of the modern aesthetic judgment of visual sense postulated by Immanuel Kant (1724-1804) with an easy aesthetic feeling related to the sensation of empathy, the Einfühlung of Robert Vischer (1847-1933). In the age of mass culture, a lowering of the demands placed on the artifact is admitted, together with the Brutalist adaptation through the transfiguration of the processes of modern architecture. Thus form is replaced by figure or by material summary; the underlying formal structure by rhythm and the exposure of the physical structure; visual recognition by psychological enthusiasm or the Dionysian impulse; conception by the parti, or even by the concept; systematization and order by molding and organization; abstraction and synthesis by originality and essentiality; the constructive sense by material honesty; the identity of the parts by casting or by objectual oneness; and the residence by the primitive hut.
Abstract:
The archaeological evidence found along the Brazilian coast attests that this area was occupied, from at least 8,000 years BP, by fisher-gatherer groups who exploited coastal aquatic environments. Although the scientific community believes the sambaqui builders were skilled navigators, evidence in this regard is still rare. In this article, from an approach centered on Maritime Archaeology, we present arguments, hypotheses and evidence supporting the understanding that, beyond a strong economic and symbolic relationship with aquatic environments, the sambaqui peoples appropriated or developed navigation techniques and nautical artifacts.
Abstract:
We describe the planning, implementation, and initial results of the first planned move of the default position of spectra on the Hubble Space Telescope's Cosmic Origins Spectrograph (COS) Far Ultraviolet (FUV) cross-delay line detector. The move was motivated by the limited amount of charge that can be extracted from the microchannel plate, due to gain sag, at any one position. Operations at a new location began on July 23, 2012, with a shift of the spectrum by +3.5" (corresponding to ~41 pixels or ~1 mm) in a direction orthogonal to the spectral dispersion. Operation at this second "lifetime position" allows spectra to be collected that are not affected by detector artifacts and loss of sensitivity due to gain sag. We discuss programs designed to enable operations at the new lifetime position; these include determinations of operational high voltage, measuring walk corrections and focus, confirming spectrum placement and aperture centering, and target acquisition performance. We also present results related to calibration of the new lifetime position, including measurements of spectral resolution and wavelength calibration, flux and flat field calibration, carryover of time-dependent sensitivity monitoring, and operations with the Bright Object Aperture (BOA).
Abstract:
Background: Few data exist on simple, robust parameters for predicting image noise in cardiac computed tomography (CT). Objectives: To evaluate the value of a simple measure of subcutaneous tissue as a predictor of image noise in cardiac CT. Methods: 86 patients underwent prospective ECG-gated coronary computed tomographic angiography (CTA) and coronary calcium scoring (CAC) with 120 kV and 150 mA. Image quality was measured objectively as the image noise in the aorta in the cardiac CTA, and low noise was defined as noise < 30 HU. The chest anteroposterior diameter and lateral width, the image noise in the non-contrast scan and the skin-sternum (SS) thickness were measured as predictors of cardiac CTA noise. The association of the predictors with image noise was assessed using Pearson correlation. Results: The mean radiation dose was 3.5 ± 1.5 mSv. The mean image noise in cardiac CTA was 36.3 ± 8.5 HU, and the mean image noise in the non-contrast scan was 17.7 ± 4.4 HU. All predictors were independently associated with cardiac CTA noise. The best predictors were SS thickness, with a correlation of 0.70 (p < 0.001), and noise in the non-contrast images, with a correlation of 0.73 (p < 0.001). When evaluating the ability to predict low image noise, the areas under the ROC curve for the non-contrast noise and for the SS thickness were 0.837 and 0.864, respectively. Conclusion: Both SS thickness and CAC noise are simple, accurate predictors of cardiac CTA image noise. These parameters can be incorporated in standard CT protocols to adjust radiation exposure adequately.
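The predictor-noise association via Pearson correlation is a one-formula computation; a minimal sketch over paired measurements (e.g. SS thickness vs. CTA noise per patient):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Covariance term over the standard deviations of both samples
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1, like the 0.70 and 0.73 reported above, indicates that patients with larger SS thickness (or higher non-contrast noise) consistently show higher CTA noise.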
Abstract:
The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one invoked service can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach can detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
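An event sequence graph can be modeled as a directed graph with pseudo start and end events: complete event sequences (positive tests) are paths between them, and event pairs with no connecting edge are candidates for negative tests of undesirable situations. The graph below is a hypothetical toy, not taken from the case study.

```python
def event_sequences(graph, start='[', end=']'):
    """Enumerate complete event sequences: simple start-to-end paths."""
    sequences = []
    def dfs(node, path):
        if node == end:
            sequences.append(path)
            return
        for nxt in graph.get(node, ()):
            if nxt not in path:  # no repeated events: keep sequences simple
                dfs(nxt, path + [nxt])
    dfs(start, [start])
    return sequences

def faulty_event_pairs(graph, events):
    """Pairs (a, b) with no edge a->b: inputs for negative-situation tests.

    Includes (a, a) self-pairs when an event cannot follow itself."""
    return [(a, b) for a in events for b in events
            if b not in graph.get(a, ())]

# Hypothetical composition: login, then optionally search, then logout
g = {'[': ['login'], 'login': ['search', 'logout'],
     'search': ['logout'], 'logout': [']']}
```

Here `event_sequences(g)` yields two positive test sequences, while `faulty_event_pairs` produces pairs such as ('search', 'login') that a robust composition must reject gracefully.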