624 results for logs



The results of inductively coupled argon plasma (ICAP) chemical analyses carried out on some 300 core samples from Ocean Drilling Program Sites 834, 835, 838, and 839 are presented. These sites were drilled during Leg 135 in the Lau Basin. The data are compared with total gamma (SGR) wireline logs at Sites 834 and 835. Pliocene (Piacenzian) nannofossil Zone CN12, which has been identified at Sites 834 and 835, is examined in detail using spectral analyses of core and wireline logs. The potassium and calcium concentrations from the core material were used to calculate an objective depth-to-geological-time stretching function, which improved the stratigraphic correlation between sites. The integrated use of chemical analyses, wireline-log data, and paleomagnetic results improved confidence in the correlations obtained. Although no significant sedimentation periodicities were obtained from the two sites, a common concentration of energy between 30 and 60 k.y. was recorded.
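
The workflow implied here (stretch a depth series to geological time using tie points, then look for shared spectral energy) can be sketched as follows; the data, knot values, and the 30-60 k.y. band check are illustrative assumptions, not the authors' code or values.

```python
# Illustrative sketch: stretch a K-concentration depth series to geological
# time, then measure spectral energy in the 30-60 k.y. band. All inputs are
# placeholders.
import numpy as np

depth = np.linspace(0.0, 50.0, 300)                    # mbsf (hypothetical)
k_conc = 1.0 + 0.2 * np.sin(2 * np.pi * depth / 4.0)   # wt% K (placeholder signal)
depth_knots = np.array([0.0, 25.0, 50.0])              # depth-to-age tie points
age_knots = np.array([0.0, 600.0, 1400.0])             # ages in k.y. (illustrative)

age = np.interp(depth, depth_knots, age_knots)         # apply stretching function
t = np.linspace(age[0], age[-1], 512)                  # uniform time grid
sig = np.interp(t, age, k_conc)
sig -= sig.mean()                                      # remove DC before the FFT

freq = np.fft.rfftfreq(sig.size, d=t[1] - t[0])        # cycles per k.y.
power = np.abs(np.fft.rfft(sig)) ** 2

band = (freq > 1 / 60.0) & (freq < 1 / 30.0)           # periods of 30-60 k.y.
print("fraction of variance in the 30-60 k.y. band:",
      power[band].sum() / power[1:].sum())
```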


Geochemical well logs were used to measure the dry weight percent oxide abundances of Si, Al, Ca, Mg, Fe, Ti, and K and the elemental abundances of Gd, S, Th, and U at 0.15-m intervals throughout the basement section of Hole 504B. These geochemical data are used to estimate the integrated chemical exchange resulting from hydrothermal alteration of the oceanic crust over the last 5.9 Ma. A large increase in Si in the transition zone between pillows and dikes (Layers 2B and 2C) indicates that mixing of hot, upwelling hydrothermal fluids with cold, downwelling seawater occurred in the past at a permeability discontinuity at this level in the crust, even though the low-to-high permeability boundary in Hole 504B is now 500 m shallower (at the Layer 2A/2B boundary). The observations of extensive Ca loss and Mg gain agree with the chemical exchanges recorded in laboratory experiments on basalt-seawater reactions at high temperatures. The K budget requires significant addition to Layer 2A from both high-temperature depletion in Layers 2B and 2C and low-temperature alteration by seawater. Integrated water/rock ratios are derived for the mass of seawater required to add the enriched elements and for the mass of hydrothermal fluid required to remove the depleted elements in the crust at Hole 504B.
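
The integrated water/rock ratio in the final sentence follows from a mass balance; the sketch below shows one plausible form of that arithmetic for a Mg gain, with the rock properties and alteration thickness assumed purely for illustration (only the seawater Mg concentration is a known value).

```python
# Back-of-the-envelope mass balance (assumed logic, not the paper's numbers):
# the seawater mass needed to supply an element gained by the crust equals
# the integrated gain divided by the element's seawater concentration, here
# assuming the circulating seawater is fully stripped of Mg (a lower bound).
rho_rock = 2900.0        # kg/m^3, basalt density (assumed)
thickness = 500.0        # m of altered section (illustrative)
delta_mg = 0.005         # kg Mg gained per kg rock, i.e. 0.5 wt% (illustrative)
c_mg_seawater = 1.29e-3  # kg Mg per kg seawater

mg_gain = rho_rock * thickness * delta_mg   # kg Mg per m^2 of seafloor
water_mass = mg_gain / c_mg_seawater        # kg seawater per m^2
rock_mass = rho_rock * thickness            # kg rock per m^2
print(f"integrated water/rock ratio (by mass): {water_mass / rock_mass:.1f}")
```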


The cores and dredges described here were collected during the R/V Argo ZETES Expedition (March to August 1966) of the Scripps Institution of Oceanography. A total of 53 cores and dredges were recovered; they are archived at the Scripps Institution of Oceanography and available for sampling and study.


The cores and dredges described here were collected during the R/V New Horizon RISE III Expedition (April to May 1979) of the Scripps Institution of Oceanography. A total of 36 cores and dredges were recovered; they are archived at the Scripps Institution of Oceanography and available for sampling and study. The expedition carried out the field component of a combined field and laboratory study aimed at improving understanding of the origin and development of young oceanic volcanoes.


Site 1103 was one of a transect of three sites drilled across the Antarctic Peninsula continental shelf during Leg 178. The aim of drilling on the shelf was to determine the age of the sedimentary sequences and to ground-truth previous interpretations of the depositional environment (i.e., topsets and foresets) of progradational seismostratigraphic sequences S1, S2, S3, and S4. The ultimate objective was to obtain a better understanding of the history of glacial advances and retreats on this west Antarctic margin. Drilling the topsets of the progradational wedge (0-247 m below seafloor [mbsf]), which consist of unsorted and unconsolidated materials of seismic Unit S1, proved very difficult, resulting in very low (2.3%) core recovery. Recovery improved (34%) below 247 mbsf, corresponding to sediments of seismic Unit S3, which have a consolidated matrix. Logs were only obtained from the interval between 75 and 244 mbsf, and inconsistencies in the automatic analog picking of the signals received from the sonic log at the array and at the two other receivers prevented accurate shipboard time-depth conversions. This, in turn, limited the capacity for making seismic stratigraphic interpretations at this site and regionally. This study is an attempt to compile all available data sources, perform quality checks, and introduce nonstandard processing techniques for the logging data obtained, to arrive at a reliable and continuous depth vs. velocity profile. We defined 13 data categories using differential traveltime information. Polynomial exclusion techniques of various orders and low-pass filtering reduced the noise of the initial data pool and produced a definitive velocity-depth profile that is synchronous with the resistivity logging data. A comparison of the velocity profile produced with various other logs of Site 1103 further validates the presented data. All major logging units are expressed within the new velocity data. A depth-migrated section with the new velocity data is presented together with the original time section and the initial depth estimates published within the Leg 178 Initial Reports volume. The presented data confirm the location of the shelf unconformity at 222 ms two-way traveltime (TWT), or 243 mbsf, and allow its seismic identification as a strong negative and subsequent positive reflection.
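
The reported tie of 222 ms TWT to 243 mbsf can be sanity-checked with the standard traveltime-depth relation; the check below is my addition, not part of the study.

```python
# Consistency check (standard relation): for two-way traveltime t and
# average velocity v above a reflector, depth = v * t / 2.
twt = 0.222                    # s, TWT of the shelf unconformity
depth = 243.0                  # mbsf, reported depth of the unconformity
v_avg = 2.0 * depth / twt      # implied average velocity above the reflector
print(f"implied average velocity: {v_avg:.0f} m/s")   # ~2189 m/s, plausible
# Depth migration with a full velocity-depth profile generalizes this to
# z = sum(v_i * dt_i / 2) over the layered traveltime intervals.
```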


Over the past few years, logging has evolved from simple printf statements to more complex and widely used logging libraries. Today, logging information is used to support various development activities such as fixing bugs, analyzing the results of load tests, monitoring performance, and transferring knowledge. Recent research has examined how to improve logging practices by informing developers what to log and where to log. Furthermore, the strong dependence on logging has led to the development of logging libraries that have reduced the intricacies of logging, which has resulted in an abundance of log information. Two recent challenges have emerged as modern software systems start to treat logging as a core aspect of their software. In particular, 1) infrastructural challenges have emerged due to the plethora of logging libraries available today, and 2) processing challenges have emerged due to the large number of log processing tools that ingest logs and produce useful information from them. In this thesis, we explore these two challenges. We first explore the infrastructural challenges that arise due to the plethora of logging libraries available today. As systems evolve, their logging infrastructure has to evolve as well (commonly by migrating to new logging libraries). We explore logging library migrations within Apache Software Foundation (ASF) projects and find that close to 14% of the projects within the ASF migrate their logging libraries at least once. For processing challenges, we explore the different factors that can affect the likelihood of a logging statement changing in the future in four open source systems, namely ActiveMQ, Camel, CloudStack, and Liferay. Such changes are likely to negatively impact the log processing tools that must be updated to accommodate them. We find that 20%-45% of the logging statements within the four systems are changed at least once. We construct random forest classifiers and Cox models to determine the likelihood of both just-introduced and long-lived logging statements changing in the future. We find that file ownership, developer experience, log density, and SLOC are important factors in determining the stability of logging statements.
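
A minimal sketch of the classifier-construction step is given below; the features mirror the factors the thesis reports as important (file ownership, developer experience, log density, SLOC), but the data, labels, and pipeline details are placeholders rather than the thesis's actual setup.

```python
# Sketch of a random forest predicting whether a logging statement will
# change (features follow the thesis's reported factors; data are fabricated).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(1, 10, n),      # file ownership: number of distinct owners
    rng.integers(0, 500, n),     # developer experience: prior commits
    rng.random(n),               # log density: logging statements per SLOC
    rng.integers(50, 5000, n),   # SLOC of the containing file
])
y = rng.integers(0, 2, n)        # 1 = logging statement later changed (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
print("feature importances:", clf.feature_importances_)
```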


Two decades of unprecedented changes in the media landscape have increased the complexity of informing the public through news media. With significant changes to the way the news industry does business and the way news consumers access this information, a new set of skills is being proposed as essential for today's news consumer. News literacy is the use of critical thinking skills to assess the reliability and source of the information that people consume on a daily basis, and the fostering of self-awareness of personal news consumption habits and of how those habits can create audience bias. The purpose of this study was to examine how adults experience the news in their everyday lives and to describe the nature of the news literacy skills people employ in their daily news consumption. This study purposefully selected four adults who had completed high school and who regularly consume news information across a number of platforms, both traditional and digital. Two of the participants, one man and one woman, were over 50 years old; a third participant, a man, was in his 30s, and the fourth, a young woman, was in her 20s. They all used both traditional and digital media on a regular basis and had differing skill levels when using social media for information. Their news experiences were documented by in-depth interviews and the completion of seven daily news logs. In their daily logs the participants differentiated news information from other information available online, but the interviews revealed a contradiction between their intentions and their news consumption practices. All four participants had trouble distinguishing between news and opinion pieces in the news information realm. In addition, all but one seemed unaware of their personal bias and of any effect it might be having on their news consumption. Further research should explore the benefits of an adult-centered news literacy curriculum for news consumers similar to the participants, and should examine the development of audience bias and its relationship to the torrent of information people are exposed to daily.


The aim of this work was to track and verify the delivery of respiratory-gated irradiations, performed with three versions of TrueBeam linac, using a novel phantom arrangement that combined the OCTAVIUS® SRS 1000 array with a moving platform. The platform was programmed to generate sinusoidal motion of the array. This motion was tracked using the real-time position management (RPM) system, and four amplitude gating options were employed to interrupt MV beam delivery when the platform was not located within set limits. Time-resolved spatial information extracted from analysis of x-ray fluences measured by the array was compared to the programmed motion of the platform and to the trace recorded by the RPM system during the delivery of the x-ray field. Temporal data recorded by the phantom and the RPM system were validated against trajectory log files, recorded by the linac during the irradiation, as well as oscilloscope waveforms recorded from the linac target signal. Gamma analysis was employed to compare time-integrated 2D x-ray dose fluences with theoretical fluences derived from the probability density function for each of the gating settings applied, where gamma criteria of 2%/2 mm, 1%/1 mm and 0.5%/0.5 mm were used to evaluate the limitations of the RPM system. Excellent agreement was observed in the analysis of spatial information extracted from the SRS 1000 array measurements. Comparisons of the average platform position with the expected position indicated absolute deviations of <0.5 mm for all four gating settings. Differences were observed when comparing time-resolved beam-on data stored in the RPM files and trajectory logs to the true target signal waveforms. Trajectory log files underestimated the cycle time between consecutive beam-on windows by 10.0 ± 0.8 ms. All measured fluences achieved 100% pass rates using gamma criteria of 2%/2 mm, and 50% of the fluences achieved pass rates >90% when criteria of 0.5%/0.5 mm were used. Results using this novel phantom arrangement indicate that the RPM system is capable of accurately gating x-ray exposure during the delivery of a fixed-field treatment beam.
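
The gamma analysis used to compare measured and theoretical fluences follows the standard distance-to-agreement/dose-difference formulation; below is a minimal brute-force sketch of that computation on placeholder arrays (not the study's data or software).

```python
# Minimal brute-force 2D gamma index (standard formulation; placeholder data).
import numpy as np

def gamma_pass_rate(ref, ev, spacing, dd=0.02, dta=2.0):
    """Fraction of reference points with gamma <= 1.
    ref, ev: fluence arrays on the same grid; spacing: mm per pixel;
    dd: dose-difference criterion (fraction of max); dta: distance (mm)."""
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    dd_abs = dd * ref.max()
    gamma = np.empty_like(ref, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing ** 2
            dose2 = (ev - ref[iy, ix]) ** 2
            gamma[iy, ix] = np.sqrt(np.min(dist2 / dta**2 + dose2 / dd_abs**2))
    return np.mean(gamma <= 1.0)

ref = np.random.default_rng(1).random((32, 32))
ev = ref + 0.005 * np.random.default_rng(2).standard_normal((32, 32))
print(f"2%/2 mm pass rate: {gamma_pass_rate(ref, ev, spacing=1.0):.1%}")
```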


To remain competitive, forestry companies seek to control their procurement costs. Harvester-processors are equipped with on-board computers that allow certain functions to be controlled and automated. Yet these technologies are not widely used and are, at best, underused. While the industry shows growing interest in using these computers, little research has addressed the gains in productivity and in conformity to bucking specifications that result from using these systems. The objective of this study was to measure the impact of three degrees of automation (manual, semi-automatic, and automatic) on productivity (m3 per productive machine hour) and on the conformity rates of log lengths and topping diameters (%). Data collection took place in the harvest areas of Produits forestiers résolu (Resolute Forest Products) north of Lac St-Jean between January and August 2015. A complete block design was set up for each of the five operators who participated in the study. A 5% threshold was used for the analysis of variance, after contrasts were computed. Only one case showed a significant difference in productivity attributable to the change in the degree of automation employed, while no significant difference was detected for topping-diameter conformity; trends were nonetheless observed. The length conformities obtained by two operators showed significant differences. Since these two operators worked on two distinct machines, this suggests that the operator can also affect the length conformity rate.
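
A minimal sketch of the analysis as described (a complete block design per operator, analysis of variance at the 5% level) is given below; the data, factor names, and productivity values are hypothetical.

```python
# Sketch of a randomized complete block analysis at the 5% level
# (hypothetical data; column names are my own).
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "productivity": [18.2, 19.1, 17.5, 20.3, 19.8, 18.9, 17.7, 18.4, 18.1],
    "automation":   ["manual", "semi", "auto"] * 3,   # treatment levels
    "block":        ["b1"] * 3 + ["b2"] * 3 + ["b3"] * 3,
})
model = ols("productivity ~ C(automation) + C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))                # compare p-values to 0.05
```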


When thinking what paintings are, I am continually brought back to my memory of a short sequence in Alfred Hitchcock’s Vertigo. In the scene, Kim Novak’s Madeleine is seated on a bench in an art gallery. She is apparently transfixed by a painting, Portrait of Carlotta. Alongside James Stewart, we watch her looking intently. Madeleine is pretending to be a ghost. At this stage she does not expect us to believe she is a ghost, but simply to immerse ourselves in the conceit, to delight in the shudder. Madeleine’s back is turned away from us, and as the camera draws near to show that the knot pattern in her hair mirrors the image in the portrait, I imagine Madeleine suppressing a smile. She resolutely shows us her back, though, so her feint is not betrayed. Madeleine’s stillness in this scene makes her appear as an object, a thing in the world, a rock or a pile of logs perhaps. We are not looking at that thing, however, but rather a residual image of something creaturely, a spectre. This after-image is held to the ground both by the gravity suggested by its manifestation and by the fine lie - the camouflage - of pretending to be a ghost. Encountering a painting is like meeting Madeleine. It sits in front of its own picture, gazing at it. Despite being motionless and having its back to us, there is a lurching sensation the painting brings about by pretending to be the ghost of its picture, and, at the same time, never really anticipating your credulity.


Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the session, and therefore highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems, and it is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures the sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operations, and large deviations from "normal behavior" may indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user behavior as embodied in the web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types, and is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems such as mobile devices and the analysis of network traffic.
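
A toy version of the n-gram idea is sketched below: a bigram model with add-one smoothing is trained on a user's action history, and sessions are scored by average log-probability. The action names, the smoothing choice, and the flagging rule are illustrative assumptions, not Intruder Detector's actual model.

```python
# Toy continuous-authentication scorer: bigram model with add-one smoothing
# over user actions extracted from web logs (all names are illustrative).
import math
from collections import Counter

def train_bigram(actions):
    pairs = Counter(zip(actions, actions[1:]))
    unigrams = Counter(actions)
    vocab = len(unigrams)
    def prob(prev, cur):                       # smoothed P(cur | prev)
        return (pairs[(prev, cur)] + 1) / (unigrams[prev] + vocab)
    return prob

def avg_log_prob(session, prob):
    lps = [math.log(prob(a, b)) for a, b in zip(session, session[1:])]
    return sum(lps) / len(lps)

history = ["login", "search", "open", "edit", "save",
           "search", "open", "save", "logout"]
prob = train_bigram(history)

normal = ["login", "search", "open", "save"]
odd = ["login", "export", "export", "export"]  # actions this user never takes
print(avg_log_prob(normal, prob), avg_log_prob(odd, prob))
# A large drop from the user's typical score would flag the session for review.
```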


With the dramatic growth of text information, there is an increasing need for powerful text mining systems that can automatically discover useful knowledge from text. Text is generally associated with all kinds of contextual information. Those contexts can be explicit, such as the time and the location where a blog article is written and the author(s) of a biomedical publication, or implicit, such as the positive or negative sentiment that an author had when writing a product review; there may also be complex context, such as the social network of the authors. Many applications require analysis of topic patterns over different contexts. For instance, analysis of search logs in the context of the user can reveal how to improve the quality of a search engine by optimizing the search results for particular users; analysis of customer reviews in the context of positive and negative sentiments can help the user summarize public opinions about a product; and analysis of blogs or scientific publications in the context of a social network can facilitate the discovery of more meaningful topical communities. Since context information significantly affects the choices of topics and language made by authors, it is very important to incorporate it into analyzing and mining text data. Modeling the context in text and discovering contextual patterns of language units and topics from text, a general task which we refer to as contextual text mining, has widespread applications in text mining. In this thesis, we provide a novel and systematic study of contextual text mining, a new paradigm of text mining that treats context information as the "first-class citizen." We formally define the problem of contextual text mining and its basic tasks, and propose a general framework for contextual text mining based on generative modeling of text. This conceptual framework provides general guidance on text mining problems with context information and can be instantiated into many real tasks, including the general problem of contextual topic analysis. We formally present a functional framework for contextual topic analysis, with a general contextual topic model and its various versions, which can effectively solve text mining problems in many real-world applications. We further introduce general components of contextual topic analysis: adding priors to contextual topic models to incorporate prior knowledge, regularizing contextual topic models with the dependency structure of context, and postprocessing contextual patterns to extract refined patterns. These refinements of the general contextual topic model naturally lead to a variety of probabilistic models that incorporate different types of context and various assumptions and constraints. These special versions of the contextual topic model prove effective in a variety of real applications involving topics and explicit contexts, implicit contexts, and complex contexts. We then introduce a postprocessing procedure for contextual patterns that generates meaningful labels for multinomial context models. This method provides a general way to interpret text mining results for real users. By applying contextual text mining in the "context" of other text information management tasks, including ad hoc text retrieval and web search, we further demonstrate the effectiveness of contextual text mining techniques in a quantitative way with large-scale datasets.
The framework of contextual text mining not only unifies many explorations of text analysis with context information, but also opens up many new possibilities for future research directions in text mining.
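
One simple instantiation of contextual topic analysis (fit a topic model, then aggregate topic mixtures by an explicit context) can be sketched as follows; this uses an off-the-shelf LDA rather than the thesis's generative contextual topic models, and the corpus and contexts are fabricated.

```python
# Toy instantiation: fit LDA once, then compare topic mixtures across an
# explicit context (here, year); corpus and contexts are fabricated.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = ["query ranking click log", "click log user session",
        "review sentiment product opinion", "opinion sentiment positive review"]
contexts = ["2005", "2005", "2006", "2006"]   # one context value per document

X = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
theta = lda.transform(X)                      # per-document topic mixtures

for c in sorted(set(contexts)):
    mask = np.array(contexts) == c
    print(c, theta[mask].mean(axis=0))        # average topic coverage per context
```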


The hydrodynamics and hydrochemistry of salt and fresh water from solid-rock aquifer systems in the Pyrmont area are described and interpreted on the basis of recent investigations, including geoelectrics, isotope hydrology, and soil air analysis. Theories on the source of the springs in this area are developed which explain the different compositions of the springs and make it possible to protect them. Data from new and reinterpreted drill holes, borehole logs, and outcrops suggest a revision of the geological structure of the Pyrmont dome. Bad Pyrmont is situated on a wide dome of Triassic rocks in the southern part of the Lower Saxony uplands. Inversion of the relief has caused the development of an erosional basin surrounded by prominent ridges. Deep faults developed at the crest of the dome, as this part of the structure was subjected to the strongest tectonic stress. Subrosion of the Zechstein salts has caused the main salt bed to wedge out below the western part of the dome along a N-S striking structure referred to as the "Salzhang" (salt slope). West of the "Salzhang", where subrosion has removed the salt bed that elsewhere prevents gas from rising from below, carbon dioxide of deep volcanic origin can now rise to the surface. Hydraulic cross sections illustrate the presence of extensive and deep-seated groundwater flow within the entire Pyrmont dome. While groundwater flow is directed vertically downwards in the ridges surrounding the dome, centripetal horizontal flow predominates in the intermediate area. In the central part of the dome, groundwater rises to join the River Emmer, the main receiving watercourse in the central part of the eroded basin. The depth of the saltwater/freshwater interface is determined by the weight of the superimposed freshwater body. Hydrochemical cross sections show the shape and position of the interface and document a certain degree of hydrochemical zonation of the weakly mineralized fresh water. Genetic relationships between the two main water types and the hydrochemical zones of the freshwater body are discussed. Knowledge of the hydrogeological relationships in the Bad Pyrmont aquifer systems permits a spatially close coexistence of wells withdrawing groundwater for different purposes (medicinal, mineral, drinking, and industrial water).
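
The statement that the interface depth is determined by the weight of the superimposed freshwater body is, in effect, the classical Ghyben-Herzberg balance; the sketch below is my gloss of that relation with illustrative densities and head, not values from the study.

```python
# Ghyben-Herzberg balance (my gloss of the abstract's statement):
# z = rho_f / (rho_s - rho_f) * h, i.e. roughly 40 m of interface depth
# per metre of freshwater head for a seawater-like brine density.
rho_f = 1000.0   # kg/m^3, freshwater (assumed)
rho_s = 1025.0   # kg/m^3, saline water (assumed seawater-like)
h = 5.0          # m, freshwater head above the receiving level (illustrative)
z = rho_f / (rho_s - rho_f) * h
print(f"interface depth below the reference level: {z:.0f} m")
```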


This paper deals with the lithostratigraphic structure of the Solling sequence (Lower Triassic, Middle Buntsandstein) in the area between the Weser river region in the west and the Thuringian Eichsfeld region in the east. Lithologic profile mapping and the gamma-ray logs of several boreholes and 40 exposures have been used to define the lithostratigraphic classification of the Solling sequence, to delineate the facies zones, and to establish the connection between sediments of the Thuringian basin in the east and the Weser fault trough via the crest of the Eichsfeld-Altmark Ridge. Tectonically controlled synsedimentary movements are the reason for the extreme convergence within the Solling sequence and the extreme stratigraphic gap at its base (Hardegsen unconformity, Trusheim 1961) in the region of the swells. The discussion also demonstrates the importance of fault bundles that were active during the Triassic and responsible for the thickness pattern of the Solling sequence between the Weser fault trough and the Eichsfeld-Altmark Ridge. The largest stratigraphic gap is present along the line Brehme (Ohm Mountains) - Beuren - Treffurt, where the Solling sequence covers Avicula-bearing layers of the Volpriehausen sequence. In particular, the ridge sequences prove the existence of a further erosional unconformity within the Solling sequence (Solling unconformity, Kunz 1965) below the Thuringian Chirotheriensandstein, as found by Rohling (1986) in the North German basin at the stratigraphic level of the Karlshafen layers.


Internship report presented to the Escola Superior de Educação de Paula Frassinetti for the degree of Master in Teaching of the 1st and 2nd Cycles of Basic Education.