850 results for High-dimensional data visualization
Abstract:
This work presents the results of a study on the use of experimentation in physics teaching, based on a survey of the use of this strategy in public high schools in the region of São José dos Campos. Data were collected in the schools by means of questionnaires prepared for physics teachers and for students from the three grades of high school. Data were obtained in eighteen schools distributed across the city of São José dos Campos (two in the central region, one in the west, three on the east side, seven in the north and five in the south), one school in the district of São Francisco Xavier and one school in the town of Monteiro Lobato. In total, data from 20 schools, 610 students and 20 teachers were analyzed. Among the main results, we highlight that over 80% of the students said there is no physics laboratory at the school where they study, and less than 1% declared that they use a laboratory weekly. We note that a laboratory is present in 25% of the schools in the northern region and in 10% or less of the schools in the other regions. According to the students, the proposition of experimental activities by teachers is rare: only in three schools on the east side did the options "sometimes" and "always" exceed the number of responses stating that experimental activities are never proposed. Sixty percent of the teachers reported using the São Paulo state curriculum to propose activities to students, and half of the teachers said they either had no experimental physics classes in their initial training or did not have them in sufficient quantity to support the use of this resource in practice.
Abstract:
The Advanced LIGO gravitational wave detectors are second-generation instruments designed and built for the two LIGO observatories in Hanford, WA and Livingston, LA, USA. The two instruments are identical in design, and are specialized versions of a Michelson interferometer with 4 km long arms. As in Initial LIGO, Fabry-Perot cavities are used in the arms to increase the interaction time with a gravitational wave, and power recycling is used to increase the effective laser power. Signal recycling has been added in Advanced LIGO to improve the frequency response. In the most sensitive frequency region around 100 Hz, the design strain sensitivity is a factor of 10 better than Initial LIGO. In addition, the low frequency end of the sensitivity band is moved from 40 Hz down to 10 Hz. All interferometer components have been replaced with improved technologies to achieve this sensitivity gain. Much better seismic isolation and test mass suspensions are responsible for the gains at lower frequencies. Higher laser power, larger test masses and improved mirror coatings lead to the improved sensitivity at mid and high frequencies. Data collecting runs with these new instruments are planned to begin in mid-2015.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The objective of this paper is to verify and analyze the existence in Brazil of stylized facts observed in financial time series: volatility clustering, probability distributions with fat tails, the presence of long-range memory in absolute return series, absence of linear return autocorrelation, gain/loss asymmetry, aggregational Gaussianity, slow decay of absolute return autocorrelation, trading volume/volatility correlation and the leverage effect. We analyzed intraday prices for 10 stocks traded at the BM&FBovespa, responsible for 52.1% of the Ibovespa portfolio on Sept. 01, 2009. The data analysis confirms the stylized facts, whose behavior is consistent with what is observed in international markets.
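Several of the stylized facts listed above reduce to simple sample statistics. The following is a minimal, hypothetical sketch (using synthetic GARCH(1,1)-style returns, not the paper's BM&FBovespa intraday sample) of three of the tests: near-zero linear autocorrelation of returns, positive autocorrelation of absolute returns (volatility clustering), and excess kurtosis (fat tails).

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of series x at a given lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(0)
# Synthetic GARCH(1,1)-like returns: volatility clustering by construction.
n = 5000
omega, alpha, beta = 1e-5, 0.1, 0.85
sigma2 = np.empty(n)
r = np.empty(n)
sigma2[0] = omega / (1 - alpha - beta)
for t in range(n):
    if t > 0:
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Stylized facts the paper tests:
# 1) near-zero linear autocorrelation of returns,
# 2) positive autocorrelation of |returns| (volatility clustering),
# 3) excess kurtosis (fat tails) relative to the Gaussian value of 3.
print(autocorr(r, 1))           # statistically near 0
print(autocorr(np.abs(r), 1))   # clearly positive
kurtosis = np.mean(r ** 4) / np.mean(r ** 2) ** 2
print(kurtosis)                 # above the Gaussian value of 3
```

On a real intraday series the same three statistics would be computed per stock; the GARCH generator here merely reproduces the qualitative behavior being tested for.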
Abstract:
This paper explores the benefits of using immersive and interactive multiprojection environments (CAVEs) to visualize molecules, and how they improve users' understanding. We have proposed and implemented one tool for teachers to manipulate molecules and another for editing molecules and assisting students at home. The contribution of the present research project is this pair of tools, which allow investigating the structure, properties and dynamics of molecular systems that are extremely complex and comprise millions of atoms. The experience is enriched through multimedia information associated with parts of the model; for example, videos and text can be linked to a specific molecule to demonstrate some detail. The solution is designed around the teaching-learning process.
Abstract:
Accruing evidence indicates that connexin (Cx) channels in gap junctions (GJ) are involved in neurodegeneration after injury. However, studies using KO animal models have yielded apparently contradictory results regarding the role of coupling in neuroprotection. We analyzed the role of Cx-mediated communication in a focal lesion induced by mechanical trauma of the retina, a model that allows spatial and temporal definition of the lesion with high reproducibility, permitting visualization of the focus, penumbra and adjacent areas. Cx36 and Cx43 exhibited distinct gene expression and protein levels throughout the progress of neurodegeneration. Cx36 was observed close to TUNEL-positive nuclei, revealing the presence of this protein surrounding apoptotic cells. The functional role of cell coupling was assessed employing GJ blockers and openers combined with the lactate dehydrogenase (LDH) assay, a direct method for evaluating cell death/viability. Carbenoxolone (CBX), a broad-spectrum GJ blocker, reduced LDH release after 4 hours, whereas quinine, a Cx36-channel-specific blocker, decreased LDH release as early as 1 hour after lesion. Furthermore, analysis of dying-cell distribution confirmed that the use of GJ blockers reduced the spread of apoptosis. Accordingly, blockade of GJ communication during neurodegeneration with quinine, but not CBX, caused downregulation of initial and effector caspases. To summarize, we observed specific changes in Cx gene expression and protein distribution during the progress of retinal degeneration, indicating the participation of these elements in acute neurodegeneration processes. More importantly, our results revealed that direct control of GJ channel permeability may form part of reliable neuroprotection strategies aimed at rapid treatment of mechanical trauma in the retina.
Abstract:
Thermal treatment (thermal rectification) is a process in which technological properties of wood are modified using thermal energy, the result of which is often value-added wood. Thermally treated wood takes on color shades similar to those of tropical woods and offers considerable resistance to destructive microorganisms and weather action, in addition to having high dimensional stability and low hygroscopicity. Wood samples of Eucalyptus grandis were subjected to various thermal treatments, performed either in the presence (140 °C; 160 °C; 180 °C) or in the absence of oxygen (160 °C; 180 °C; 200 °C) inside a thermal treatment chamber, and then studied as to their chemical characteristics. Increasing the maximum treatment temperature led to a reduction in the holocellulose content of the samples as a result of the degradation and volatilization of hemicelluloses, also leading to an increase in the relative lignin content. Except for glucose, all monosaccharide levels were found to decrease in samples after thermal treatment at a maximum temperature of 200 °C. Thermal treatment above 160 °C led to increased levels of total extractives in the wood samples, probably ascribable to the emergence of low-molecular-weight substances resulting from thermal degradation. Overall, it was not possible to clearly determine the effect of the presence or absence of oxygen during thermal treatment on the chemical characteristics of the wood samples.
Abstract:
This article investigates the effect of product market liberalisation on employment, allowing for interactions between policies and institutions in product and labour markets. Using panel data for OECD countries over the period 1980–2002, we present evidence that product market deregulation is more effective at the margin when labour market regulation is high. The data also suggest that product market liberalisation may promote employment-enhancing labour market reforms.
Abstract:
Background: Oral squamous cell carcinoma (OSCC) is a frequent neoplasm, which is usually aggressive and has unpredictable biological behavior and unfavorable prognosis. The comprehension of the molecular basis of this variability should lead to the development of targeted therapies as well as to improvements in the specificity and sensitivity of diagnosis. Results: Samples of primary OSCCs and their corresponding surgical margins were obtained from male patients during surgery, and their gene expression profiles were screened using whole-genome microarray technology. Hierarchical clustering and Principal Components Analysis were used for data visualization, and one-way Analysis of Variance was used to identify differentially expressed genes. Samples clustered mostly according to disease subsite, suggesting molecular heterogeneity within tumor stages. In order to corroborate our results, two publicly available microarray datasets were assessed. We found significant molecular differences between OSCC anatomic subsites concerning groups of genes presently or potentially important for drug development, including mRNA processing, cytoskeleton organization and biogenesis, metabolic process, cell cycle and apoptosis. Conclusion: Our results corroborate literature data on the molecular heterogeneity of OSCCs. Differences between disease subsites and among samples belonging to the same TNM class highlight the importance of gene expression-based classification and challenge the development of targeted therapies.
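The visualization pipeline described in this abstract (hierarchical clustering plus PCA of an expression matrix) can be sketched in a few lines. The data, group sizes and effect size below are invented stand-ins for the paper's microarray samples, so this is a minimal illustration of the technique rather than a reproduction of the study.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Toy expression matrix: 12 samples x 50 genes, two simulated "subsites"
# (hypothetical stand-in for whole-genome microarray data).
group_a = rng.normal(0.0, 1.0, size=(6, 50))
group_b = rng.normal(2.0, 1.0, size=(6, 50))
X = np.vstack([group_a, group_b])

# PCA via SVD on the mean-centered matrix: project samples onto the
# first two principal components for 2-D visualization.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:2].T          # (12, 2) sample coordinates for plotting

# Average-linkage hierarchical clustering on Euclidean distances,
# cut into two clusters: samples should separate by simulated subsite.
Z = linkage(X, method="average", metric="euclidean")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

With a between-group shift this large, the dendrogram cut recovers the two simulated groups; on real data the cluster labels would then be compared against the anatomic subsites.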
Abstract:
[EN] This paper describes a wildfire forecasting application based on a 3D virtual environment and a fire simulation engine. A new open-source framework is presented for the development of 3D graphics applications over large geographic areas, offering high-performance 3D visualization and powerful interaction tools for the Geographic Information Systems community. The application includes a remote module that allows several users to connect simultaneously to monitor a real wildfire event. Users can simulate and visualize a wildfire spreading over the terrain, given spatial information on topography and fuels along with weather and wind files.
Abstract:
[EN] This abstract describes the development of a wildfire forecasting plugin using Capaware. Capaware is designed as an easy to use open source framework to develop 3D graphics applications over large geographic areas offering high performance 3D visualization and powerful interaction tools for the Geographic Information Systems (GIS) community.
Abstract:
The research is part of a survey for assessing the hydraulic and geotechnical conditions of river embankments, funded by the Reno River Basin Regional Technical Service of the Emilia-Romagna Region. The hydraulic safety of the Reno River, one of the main rivers in north-eastern Italy, is indeed of primary importance to the Emilia-Romagna regional administration. The large longitudinal extent of the banks (several hundred kilometres) has generated great interest in non-destructive geophysical methods, which, compared to other methods such as drilling, allow faster and often less expensive acquisition of high-resolution data. The present work aims to test Ground Penetrating Radar (GPR) for the detection of local non-homogeneities (mainly stratigraphic contacts, cavities and conduits) inside the embankments of the Reno River and its tributaries, taking into account supplementary data collected with traditional destructive tests (boreholes, cone penetration tests, etc.). A comparison with other non-destructive methodologies, namely electrical resistivity tomography (ERT), Multichannel Analysis of Surface Waves (MASW) and FDEM induction, was also carried out in order to verify the usability of GPR and to support the integration of various geophysical methods into the regular maintenance and inspection of embankment conditions. The first part of this thesis presents the state of the art concerning the geographic, geomorphologic and geotechnical characteristics of the embankments of the Reno River and its tributaries, and describes some geophysical applications on embankments of European and North American rivers, which served as the bibliographic basis for this thesis.
The second part is an overview of the geophysical methods employed in this research (with particular attention to GPR), reporting their theoretical basis and examining in depth some techniques of geophysical data analysis and representation as applied to river embankments. The subsequent chapters, following the main scope of this research, which is to highlight the advantages and drawbacks of Ground Penetrating Radar applied to the embankments of the Reno River and its tributaries, show the results obtained by analyzing different cases that could lead to the formation of weakness zones and, subsequently, to embankment failure. Among the advantages, a considerable acquisition speed and a spatial resolution of the acquired data unmatched by the other methodologies were recorded. With regard to the drawbacks, some factors related to attenuation of wave propagation, due to differing contents of clay, silt and sand, as well as surface effects, significantly limited the correlation between GPR profiles and geotechnical information and therefore compromised the embankment safety assessment. In summary, Ground Penetrating Radar can be a suitable tool for checking river dike conditions, but its use is significantly limited by the geometric and geotechnical characteristics of the levees of the Reno River and its tributaries. In fact, only the shallower part of the embankment could be investigated, and the information obtained relates only to changes in electrical properties, without any numerical measurement. Furthermore, GPR is ineffective for a preliminary assessment of embankment safety conditions, whereas for detailed campaigns at shallow depth, which aim to achieve immediate results with optimal precision, its use is strongly recommended.
The cases where a multidisciplinary approach was tested revealed an optimal interconnection of the various geophysical methodologies employed, producing qualitative results in the preliminary phase (FDEM), a quantitative and highly reliable description of the subsoil (ERT) and, finally, fast and highly detailed analysis (GPR). As a recommendation for future research, the simultaneous use of several geophysical devices to assess the safety conditions of river embankments is strongly suggested, especially when facing a likely flood event, when the entire extent of the embankments must be investigated.
Abstract:
Fluorescent dyes can be used to visualize structures that are difficult or impossible to demonstrate by conventional means. Especially in combination with confocal laser scanning microscopy, new routes open up for the specific detection of a wide variety of components of biological samples and, where applicable, their three-dimensional rendering. The protein fraction of dental hard tissue can be visualized with the aid of chemically couplable fluorochromes. To show that this labeling is not due to unspecific adsorption of the dye, the protein component of the tooth samples was removed by enzymatic digestion as a control. Specimens treated in this way showed very little stainability. This enzymatic method further served as a negative control for the detection of odontoblast processes in dentin and in the region of the enamel-dentin junction. This made it possible to differentiate between pure reflection images of the dentinal tubules and the cell processes whose membranes were specifically labeled with lipophilic fluorescent dyes. In a further approach, it was shown that reduced and therefore non-fluorescent fluorescein derivatives are suitable for demonstrating the penetration of oxidizing agents (here H2O2) into the tooth. Oxidation of these compounds generates fluorescent products, which provided proof that the agents used for tooth bleaching can rapidly pass through enamel and dentin into the pulp chamber. The dependence of the fluorescence of certain fluorochromes on their chemical environment, in the present case the pH value, was to be exploited to image the degree of acidity inside the tooth by fluorescence microscopy. An attempt was made to develop a ratiometric method in which the pH is determined using one pH-dependent and one pH-independent fluorochrome.
This method could not be verified for this particular application, since neutralization effects of the mineral tooth substance (hydroxyapatite) influence the pH distribution within the sample. Fluorescence techniques were also used, complementarily, to characterize covalently modified implant surfaces. The free amino groups introduced by silanization of titanium test specimens with triethoxyaminopropylsilane could be identified qualitatively using an amine-specific dye. This type of functionalization serves to make implant surfaces more conducive to the healing of implants into bone by chemically coupling adhesion-mediating proteins or peptides, thereby stimulating bone-forming cells to improved attachment behavior. Cell counting in the adhesion assay was likewise carried out using fluorescent dyes and yielded results demonstrating that the modification performed has a favorable influence on cell adhesion.
Abstract:
In 2002, a high-intensity data run of K_S mesons and neutral hyperons was carried out with the NA48/1 detector, during which, among other things, about 10^9 Xi^0 decay candidates were recorded. In this work, 6657 Xi^0 -> Sigma^+ e^- anti-nu and 581 anti-Xi^0 -> anti-Sigma^+ e^+ nu events were selected from this data set, and the branching ratios BR1(Gamma(Xi^0 -> Sigma^+ e^- anti-nu)/Gamma(Xi^0 total)) = (2.533 +-0.032(stat) -0.076+0.089(syst))*10^-4 and BR2(Gamma(anti-Xi^0 -> anti-Sigma^+ e^+ nu)/Gamma(anti-Xi^0 total)) = (2.57 +-0.12(stat) -0.09+0.10(syst))*10^-4 were determined. The result for BR1 is about 3.5 times more precise than the previously published measurement. The analysis of the anti-Xi^0 beta decays constitutes the first measurement of BR2. Both results agree with the theoretical prediction of 2.6*10^-4. From the Xi^0 beta branching ratio, using the experimental value of the form-factor ratio g1/f1, the CKM matrix element |Vus| = 0.209 +- 0.004(exp) +- 0.026(syst) follows, with the dominant uncertainty stemming from g1/f1. In addition, 99 Xi^0 -> Sigma^+ mu^- anti-nu decay candidates, with an estimated background of 30 events, were reconstructed in this work, and the corresponding branching ratio was likewise extracted: BR3(Gamma(Xi^0 -> Sigma^+ mu^- anti-nu)/Gamma(Xi^0 total)) = (2.11 +- 0.31(stat) +- 0.15(syst))*10^-6.