960 results for Multiple Sources
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
In the current economic scenario, the search for improvements in production quality and for cost reduction is constant. Intense competition and technological innovation make customers increasingly demanding and lead companies to seek multiple sources of improvement in production. This work aimed to use the overall desirability function to optimize a machining process involving multiple responses. Cylindrical turning is one of the most common metal-cutting machining processes and involves several factors; the best combination of input factors was analyzed with surface roughness (Ra) and cutting length (Lc) as the response variables, both of which are important measures of process efficiency and product quality. The method is a case study, since it examines a tool well covered in the literature. The analysis used data from the doctoral thesis of Ricardo Penteado, which combined metaheuristics with different weighting methods to optimize a multi-response turning process; here, the desirability function was used as the analysis tool. Joint optimization by desirability proposed the following combination of input variables: cutting speed of 90 m/min (level -1), feed rate of 0.12 mm/rev (level -1), machining depth of 1.6 mm (level 1), cutting insert TP2500 (level -1), abundant cutting fluid (level 1), and laminated material (level 1), so as to maximize the cutting length (Lc) and minimize the roughness (Ra).
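The joint-optimization step above can be sketched in a few lines of Python. The transformation bounds and response values below are hypothetical, not taken from the thesis; the one-sided transformations and the geometric-mean combination follow the standard Derringer-Suich desirability formulation.

```python
def d_maximize(y, low, high, weight=1.0):
    """Desirability for a larger-the-better response (e.g. cutting length Lc)."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** weight

def d_minimize(y, low, high, weight=1.0):
    """Desirability for a smaller-the-better response (e.g. roughness Ra)."""
    if y <= low:
        return 1.0
    if y >= high:
        return 0.0
    return ((high - y) / (high - low)) ** weight

def overall_desirability(ds):
    """Overall desirability D: geometric mean of the individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical responses for one combination of input levels:
d_lc = d_maximize(y=850.0, low=200.0, high=1000.0)  # cutting length Lc (m)
d_ra = d_minimize(y=1.2, low=0.5, high=3.0)         # roughness Ra (um)
D = overall_desirability([d_lc, d_ra])
```

In practice, D is evaluated (via fitted response models) over all candidate factor combinations, and the combination maximizing D is selected.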
Abstract:
Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
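As a minimal illustration of the hierarchical idea, the sketch below simulates a two-level normal model: a parameter model for the global mean, a process model for latent site abundances, and a data model for noisy observations, followed by the standard partial-pooling shrinkage estimate. All numbers are invented for illustration and the model is far simpler than any real ecological application.

```python
import random

random.seed(42)

# Parameter model: global mean abundance and between-site spread (assumed values)
mu, tau = 50.0, 10.0
sigma = 5.0  # measurement-error standard deviation

# Process model: true (latent) abundance at each of 8 sites
site_truth = [random.gauss(mu, tau) for _ in range(8)]

# Data model: one noisy observation of each latent state
observations = [random.gauss(theta, sigma) for theta in site_truth]

# Partial pooling: shrink each site's estimate toward the global mean,
# weighting by the relative sizes of process and measurement variance
w = tau**2 / (tau**2 + sigma**2)
pooled = [w * y + (1 - w) * mu for y in observations]
```

With known hyperparameters this shrinkage is exact; in a full hierarchical (e.g. Bayesian) analysis, mu, tau, and sigma would themselves carry uncertainty and be estimated jointly.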
Abstract:
Glacigenic diamictite successions of the Macaubas Group are widespread in the western domain of the Aracuai orogen, east of the Sao Francisco craton (Brazil). Diamictites also occur on this craton and in the African counterpart of the Aracuai orogen, the West Congo belt. Detrital zircon grains from the matrix of diamictites and sandstones from the Macaubas Group were dated by the U-Pb SHRIMP technique. The geochronological study sets the maximum depositional age of the glacial diamictites at 900 Ma, and indicates multiple sources for the Macaubas basin with ages ranging from 900 to 2800 Ma. Sm-Nd T-DM model ages, determined on whole-rock samples, range from 1.8 Ga to 2.5 Ga and become older up-section. Comparison of our data with those from the cratonic area suggests that these glacial deposits can be correlated to the Jequitai and Carrancas diamictites in the Sao Francisco craton, and to the Lower Mixtite Formation of the West Congolian Group, exposed in Africa. The 900-1000 Ma source is most probably represented by the Zadinian-Mayumbian volcanic rocks and related granites from the West Congo belt. However, one of the most voluminous sources, with ages in the 1.1-1.3 Ga interval, has not been detected in the Sao Francisco-Congo craton. Possible sources for these grains may lie elsewhere in Africa, or within the Brasilia Belt in west-central Brazil. (C) 2011 International Association for Gondwana Research. Published by Elsevier B.V. All rights reserved.
Abstract:
Background. One of the phenomena observed in human aging is the progressive increase of a systemic inflammatory state, a condition referred to as "inflammaging", which is negatively correlated with longevity. A prominent mediator of inflammation is the transcription factor NF-kB, which acts as a key transcriptional regulator of many genes coding for pro-inflammatory cytokines. Many different signaling pathways activated by very diverse stimuli converge on NF-kB, resulting in a regulatory network characterized by high complexity. NF-kB signaling has been proposed to be responsible for inflammaging. The scope of this analysis is to provide a wider, systemic picture of this intricate signaling and interaction network: the NF-kB pathway interactome. Methods. The study was carried out following a workflow for gathering information from the literature as well as from several pathway and protein-interaction databases, and for integrating and analyzing the existing data and the reconstructed representations using the available computational tools. Substantial manual intervention was necessary to integrate data from multiple sources into mathematically analyzable networks. The reconstruction of the NF-kB interactome pursued with this approach provides a starting point for a general view of the architecture and for a deeper analysis and understanding of this complex regulatory system. Results. A "core" and a "wider" NF-kB pathway interactome, consisting of 140 and 3146 proteins respectively, were reconstructed and analyzed through a mathematical, graph-theoretical approach. Among other interesting features, the topological characterization of the interactomes shows that a considerable number of interacting proteins are in turn products of genes whose expression is controlled and regulated precisely by NF-kB transcription factors.
These "feedback loops", not always well known, deserve deeper investigation, since they may have a role in tuning the response and the output following NF-kB pathway initiation, in regulating the intensity of the response, or in maintaining its homeostasis and balance so as to make the functioning of this critical system more robust and reliable. This integrated view sheds light on the functional structure and on some of the crucial nodes of the NF-kB transcription factor interactome. Conclusion. Framing the structure and dynamics of the NF-kB interactome in a wider, systemic picture would be a significant step toward a better understanding of how NF-kB globally regulates diverse gene programs and phenotypes. This study represents a step towards a more complete and integrated view of the NF-kB signaling system.
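The search for such feedback loops can be illustrated with a toy graph traversal. The interaction network below is invented (only the general NF-kB/cytokine motif is taken from the text); a real interactome analysis would use curated databases and a graph library, not a hand-written adjacency list.

```python
# Toy directed interaction network; edge A -> B means "A regulates B".
edges = {
    "NFKB1":  ["TNF", "IL1B", "NFKBIA"],
    "TNF":    ["NFKB1"],   # cytokine product that re-activates NF-kB
    "IL1B":   ["NFKB1"],
    "NFKBIA": [],          # inhibitor; no activating edge back in this toy graph
}

def feedback_loops(graph, root):
    """Return simple cycles passing through `root`, via iterative DFS."""
    loops, stack = [], [(root, [root])]
    while stack:
        node, path = stack.pop()
        for nxt in graph.get(node, []):
            if nxt == root:
                loops.append(path + [root])   # closed a cycle through root
            elif nxt not in path:
                stack.append((nxt, path + [nxt]))
    return loops

cycles = feedback_loops(edges, "NFKB1")
# Finds the two feedback loops: via TNF and via IL1B
```

This is the kind of topological query that, applied to the 140-protein "core" interactome, would enumerate the feedback loops the abstract describes.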
Abstract:
In this thesis the use of widefield imaging techniques and VLBI observations with a limited number of antennas are explored. I present techniques to efficiently and accurately image extremely large UV datasets. Very large VLBI datasets must be reduced into multiple, smaller datasets if today's imaging algorithms are to be used to image them. I present a procedure for accurately shifting the phase centre of a visibility dataset. This procedure has been thoroughly tested and found to be almost two orders of magnitude more accurate than existing techniques. Errors have been found at the level of one part in 1.1 million. These are unlikely to be measurable except in the very largest UV datasets. Results of a four-station VLBI observation of a field containing multiple sources are presented. A 13 gigapixel image was constructed to search for sources across the entire primary beam of the array by generating over 700 smaller UV datasets. The source 1320+299A was detected and its astrometric position with respect to the calibrator J1329+3154 is presented. Various techniques for phase calibration and imaging across this field are explored, including using the detected source as an in-beam calibrator and peeling of distant confusing sources from VLBI visibility datasets. A range of issues pertaining to wide-field VLBI has been explored including: parameterising the wide-field performance of VLBI arrays; estimating the sensitivity across the primary beam both for homogeneous and heterogeneous arrays; applying techniques such as mosaicing and primary beam correction to VLBI observations; quantifying the effects of time-average and bandwidth smearing; and calibration and imaging of wide-field VLBI datasets. The performance of a computer cluster at the Istituto di Radioastronomia in Bologna has been characterised with regard to its ability to correlate using the DiFX software correlator.
Using existing software, it was possible to characterise the network speed, particularly for MPI applications. The capabilities of the DiFX software correlator, running on this cluster, were measured for a range of observation parameters and were shown to be commensurate with the generic performance parameters measured. The feasibility of an Italian VLBI array has been explored, with discussion of the infrastructure required, the performance of such an array, possible collaborations, and science which could be achieved. Results from a 22 GHz calibrator survey are also presented. 21 out of 33 sources were detected on a single baseline between two Italian antennas (Medicina to Noto). The results and discussions presented in this thesis suggest that wide-field VLBI is a technique whose time has finally come. Prospects for exciting new science are discussed in the final chapter.
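The phase-centre shift mentioned above amounts to multiplying each visibility by a phase term built from the baseline coordinates and the offset in direction cosines. The plain-Python sketch below shows the standard rotation, including the w-term; the sign convention varies between packages, and this is an illustration rather than the thesis's implementation.

```python
import cmath
import math

def shift_phase_centre(vis, u, v, w, dl, dm):
    """Rotate complex visibilities to a phase centre offset by direction
    cosines (dl, dm); u, v, w are baseline coordinates in wavelengths."""
    dn = math.sqrt(1.0 - dl**2 - dm**2) - 1.0  # w-term correction
    shifted = []
    for V, ui, vi, wi in zip(vis, u, v, w):
        phase = -2.0 * math.pi * (ui * dl + vi * dm + wi * dn)
        shifted.append(V * cmath.exp(1j * phase))
    return shifted
```

Because the operation is a pure phase rotation, amplitudes are preserved exactly; the accuracy question the thesis addresses concerns how correlator-applied delay and phase models are undone and reapplied, not this rotation itself.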
Abstract:
The growing international concern over human exposure to magnetic fields generated by electric power lines has unavoidably led to the imposition of legal limits. Respecting these limits implies being able to calculate the generated magnetic field easily and accurately, including in complex configurations. Twisting of the phase conductors is such a case. The consolidated exact and approximated theory for a single-circuit twisted three-phase power cable line is reported, along with the proposal of an innovative simplified formula obtained by means of a heuristic procedure. This formula, although dramatically simpler, is proven to be a good approximation of the analytical formula and at the same time much more accurate than the approximated formula found in the literature. The double-circuit twisted three-phase power cable line has been studied following different approaches of increasing complexity and accuracy, and in this framework the effectiveness of the above-mentioned innovative formula is also examined. The experimental verification of the correctness of the twisted double-circuit theoretical analysis has permitted its extension to multiple-circuit twisted three-phase power cable lines. In addition, appropriate 2D and, in particular, 3D numerical codes have been created for simulating real overhead power lines and calculating the magnetic field in their vicinity. Finally, an innovative 'smart' measurement and evaluation system for the magnetic field is proposed, described and validated. It performs an experimentally based evaluation of the total magnetic field B generated by multiple sources in complex three-dimensional arrangements, carried out on the basis of the measurement of the three Cartesian field components and their correlation with the line currents via multilinear regression techniques. The ultimate goal is to verify that the magnetic induction intensity is within the prescribed limits.
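The regression step of the 'smart' evaluation system, correlating a measured field component with the line currents, can be sketched as an ordinary least-squares fit. The two-circuit example below uses invented currents and coupling coefficients and solves the 2x2 normal equations directly; a real system would fit all three Cartesian components against every source current.

```python
def fit_two_currents(B, I1, I2):
    """Least-squares coupling coefficients a1, a2 for B = a1*I1 + a2*I2,
    via the normal equations for two predictors and no intercept."""
    s11 = sum(i * i for i in I1)
    s22 = sum(i * i for i in I2)
    s12 = sum(i * j for i, j in zip(I1, I2))
    b1 = sum(i * b for i, b in zip(I1, B))
    b2 = sum(i * b for i, b in zip(I2, B))
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det

# Synthetic simultaneous measurements with known couplings 0.4 and 0.1 uT/A:
I1 = [100.0, 150.0, 200.0, 120.0]   # circuit-1 current samples (A)
I2 = [80.0, 60.0, 90.0, 110.0]      # circuit-2 current samples (A)
B = [0.4 * i + 0.1 * j for i, j in zip(I1, I2)]  # field component (uT)
a1, a2 = fit_two_currents(B, I1, I2)
```

Once the coefficients are known, the field at the measurement point can be predicted for any current loading, which is what allows checking compliance with the exposure limits.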
Abstract:
Body-centric communications are emerging as a new paradigm in the panorama of personal communications. Being concerned with human behaviour, they are suitable for a wide variety of applications. Advances in the miniaturization of portable devices to be placed on or around the body foster the diffusion of these systems, in which the human body is the key element defining communication characteristics. This thesis investigates the human impact on body-centric communications in its distinctive aspects. First of all, the unique propagation environment defined by the body is described through a scenario-based channel-modeling approach, according to the communication scenario considered, i.e., on-body or on-to-off-body. The novelty introduced lies in a description of radio channel features that accounts for multiple sources of variability at the same time. Secondly, the importance of proper channel characterisation is shown by integrating the on-body channel model into a system-level simulator, allowing a more realistic comparison of different Physical and Medium Access Control layer solutions. Finally, the structure of a comprehensive simulation framework for system performance evaluation is proposed. It aims at merging into one tool the mobility and social features typical of the human being, together with the propagation aspects, in a scenario where multiple users interact sharing space and resources.
Abstract:
Volatile organic compounds (VOCs) are present in the atmosphere only in trace amounts, yet they play an important role in air chemistry: they influence tropospheric ozone, urban smog, and oxidative capacity, and have direct and indirect effects on global climate change. An important class of VOCs are the non-methane hydrocarbons (NMHCs), which come predominantly from anthropogenic sources. Atmospheric chemists therefore need an instrument that measures VOCs, including NMHCs, at high time resolution, particularly for real-time measurements aboard a research aircraft. To this end, the system for fast observation of trace organics (FOTOS) was designed and built for deployment on a new high-altitude, long-range research aircraft called HALO. FOTOS was subsequently tested in two ground-based measurement campaigns. FOTOS was designed and built around a custom-made, automated, three-trap cryogenic sampling system coupled to an adapted, commercially acquired fast GC-MS. The aim of this design was to increase versatility and reduce the potential for interference, so no chemical drying agents or adsorbent materials were used. FOTOS achieved a sampling frequency of 5.5 minutes while measuring at least 13 different C2 to C5 NMHCs. The three-sigma detection limits for n- and iso-pentane were determined to be 2.6 and 2.0 pptv, respectively. Laboratory tests confirmed that FOTOS is a versatile, robust, highly automated, precise, accurate, and sensitive instrument, suitable for real-time VOC measurements at sampling frequencies appropriate for a research aircraft such as HALO.
To confirm the performance of FOTOS, an intercomparison with the GC-FID system at the Hohenpeißenberg Meteorological Observatory, a WMO-GAW global station, was carried out from 26 January to 4 February 2010. Thirteen different NMHCs were analyzed and compared within the framework of the GAW Data Quality Objectives (DQOs). More than 80% of the measurements of six C3 to C5 NMHCs met these DQOs. This first field campaign highlighted the robustness and measurement accuracy of FOTOS, in addition to the advantage of its higher sampling frequency, even in a ground-based measurement. To demonstrate the capabilities of this instrument in the field, FOTOS measured selected light NMHCs during a boreal forest measurement campaign, HUMPPA-COPEC 2010. From 12 July to 12 August 2010, an international group of institutes and instruments took part in measurements of physical and chemical parameters of the gas and particle phases of the air above the boreal forest at the SMEAR II station near Hyytiälä, Finland. Several key features were identified in the alkane mixing ratios and the pentane isomer ratio, in particular sharply contrasting periods of low and high variability, three biomass-burning plumes from Russian forest fires, and two days of extremely clean air from the polar region. Comparisons of the NMHCs with other anthropogenic tracers revealed several sources of anthropogenic influence at the site and allowed a distinction between local and more distant sources. A minimal natural contribution to the diurnal cycle of NOx was inferred from the correlation of NOx with alkanes. Air-mass age estimates based on the pentane isomer ratio were complicated by changing source ratios and by the peculiarities of photochemistry during the high-latitude summer.
These measurements demonstrated the value of measuring light NMHCs, even in remote regions, as an additional specific marker of anthropogenic influence.
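The three-sigma detection limits quoted above are conventionally computed from the scatter of repeated blank measurements and the calibration slope. The sketch below uses invented blank signals and an assumed sensitivity; it is not the FOTOS calibration.

```python
import statistics

def three_sigma_lod(blank_signals, sensitivity):
    """Three-sigma detection limit: 3 x sample SD of repeated blank
    measurements, divided by the calibration slope (signal per pptv)."""
    return 3.0 * statistics.stdev(blank_signals) / sensitivity

# Hypothetical blank replicates (arbitrary detector counts) and slope:
blanks = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]
slope = 0.25  # counts per pptv (assumed)
lod = three_sigma_lod(blanks, slope)  # detection limit in pptv
```

A value near the 2-3 pptv range reported for the pentanes would indicate comparable blank stability and sensitivity; in practice the slope comes from a multi-point calibration with certified standards.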
Abstract:
Java Enterprise Applications (JEAs) are large systems that integrate multiple technologies and programming languages. Transactions in JEAs simplify the development of code that deals with failure recovery and multi-user coordination by guaranteeing atomicity of sets of operations. The heterogeneous nature of JEAs, however, can obfuscate conceptual errors in the application code, and in particular can hide incorrect declarations of transaction scope. In this paper we present a technique to expose and analyze the application transaction scope in JEAs by merging and analyzing information from multiple sources. We also present several novel visualizations that aid in the analysis of transaction scope by highlighting anomalies in the specification of transactions and violations of architectural constraints. We have validated our approach on two versions of a large commercial case study.
Abstract:
The hydraulic fracturing of the Marcellus Formation creates a byproduct known as frac water. Five frac water samples were collected in Bradford County, PA. Inorganic chemical analysis, field parameter analysis, alkalinity titrations, total dissolved solids (TDS), total suspended solids (TSS), biological oxygen demand (BOD), and chemical oxygen demand (COD) were determined for each sample to characterize frac water. A database of frac water chemistry results from across the state of Pennsylvania, drawn from multiple sources, was compiled in order to provide the public and research community with an accurate characterization of frac water. Four geochemical models were created to model the reactions between frac water and the Marcellus Formation, the Purcell Limestone, and the oil field brines presumed present in the formations. The average concentrations of chloride and TDS in the five frac water samples were 1.1 ± 0.5 × 10^5 mg/L (5.5× average seawater) and 140,000 mg/L (4× average seawater), respectively. BOD values for frac water immediately upon flowback were over 10× greater than the BOD of typical wastewater, but decreased into the range of typical wastewater after a short period of time. The COD of frac water decreases dramatically with increasing time since flowback, but remains considerably higher than that of typical wastewater. Different alkalinity calculation methods produced a range of alkalinity values for frac water; this result is most likely due to high concentrations of aliphatic acid anions in the samples. Laboratory analyses indicate that frac water composition is quite variable, depending on the company from which the water was collected, the geology of the local area, and the number of fracturing jobs in which the frac water was used, but that it will require more treatment than typical wastewater regardless of the precise composition of each sample.
The geochemical models suggest that the presence of organic complexes in an oil field brine and the Marcellus Formation aids in the dissolution of ions such as barium and strontium into solution. Although equilibration reactions between the Marcellus Formation and the slickwater account for some of the final frac water composition, the predominant control on frac water composition appears to be the mixing ratio between the oil field brine and the slickwater. The high concentration of barium in the frac water is likely due to the abundance of barite nodules in the Purcell Limestone, and the lack of sulfate in the frac water samples is due to the reducing, anoxic conditions in the subsurface that allow for the degassing of H2S(g).
Abstract:
Several elements influence the meanings of work: the basic psychological processes of aging; the cohort or generation of the worker; the ecology of the work itself; and the larger social context of managing the risks of aging. This article discusses the meaning of work across the lifespan, and then reviews each of these elements to describe the meanings of work for older workers. The authors summarize data from multiple sources to answer several related questions: Why do older workers continue to work—beyond the solely monetary motivation? How do older workers' meanings of work vary by financial, health, job satisfaction, familial, or workplace concerns? What are the implications of these findings for employers and employees?
Abstract:
We present a Rare Earth Elements (REE) record determined on the EPICA ice core drilled at Dronning Maud Land (EDML) in the Atlantic sector of the East Antarctic Plateau. The record covers the transition from the last glacial stage (LGS) to the early Holocene (26 600–7500 yr BP) at decadal to centennial resolution. Additionally, samples from potential source areas (PSAs) for Antarctic dust were analyzed for their REE characteristics. The dust provenance is discussed by comparing the REE fingerprints in the ice core and the PSA samples. We find a shift in variability in REE composition at ~15 000 yr BP in the ice core samples. Before 15 000 yr BP, the dust composition is very uniform and its provenance was most certainly dominated by a South American source. After 15 000 yr BP, multiple sources such as Australia and New Zealand become relatively more important, although South America remains the major dust source. A similar change in the dust characteristics was observed in the EPICA Dome C ice core at around ~15 000 yr BP, accompanied by a shift in the REE composition, thus suggesting a change of atmospheric circulation in the Southern Hemisphere.
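The provenance comparison described above can be illustrated as a nearest-pattern search on normalised REE compositions. All concentrations below are invented, and the normalising composition is an assumed stand-in; real provenance work uses the full REE suite and established reference compositions.

```python
import math

# Hypothetical REE concentrations (ppm) for an ice-core dust sample and
# two potential source areas; element order: La, Ce, Nd, Sm, Yb.
ice_core = [0.30, 0.62, 0.28, 0.055, 0.020]
sources = {
    "South America": [0.32, 0.65, 0.29, 0.056, 0.021],
    "Australia":     [0.45, 0.80, 0.35, 0.080, 0.035],
}
reference = [0.30, 0.64, 0.26, 0.045, 0.022]  # normalising composition (assumed)

def normalised(pattern):
    """Divide each concentration by the reference composition."""
    return [c / r for c, r in zip(pattern, reference)]

def distance(a, b):
    """Euclidean distance between two normalised REE patterns."""
    return math.sqrt(sum((x - y) ** 2
                         for x, y in zip(normalised(a), normalised(b))))

# The PSA whose fingerprint best matches the ice-core dust:
best = min(sources, key=lambda s: distance(ice_core, sources[s]))
```

Normalising removes the overall concentration level so that only the shape of the pattern, the actual fingerprint, drives the comparison; mixtures of sources would require unmixing rather than a single nearest match.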
Abstract:
Timing divergence events allows us to infer the conditions under which biodiversity has evolved and gain important insights into the mechanisms driving evolution. Cichlid fishes are a model system for studying speciation and adaptive radiation, yet we have lacked reliable timescales for their evolution. Phylogenetic reconstructions are consistent with cichlid origins prior to Gondwanan landmass fragmentation 121-165 MYA, considerably earlier than the first known fossil cichlids (Eocene). We examined the timing of cichlid evolution using a relaxed molecular clock calibrated with geological estimates for the ages of 1) Gondwanan fragmentation and 2) cichlid fossils. Timescales of cichlid evolution derived from fossil-dated phylogenies of other bony fishes most closely matched those suggested by Gondwanan breakup calibrations, suggesting the Eocene origins and marine dispersal implied by the cichlid fossil record may be due to its incompleteness. Using Gondwanan calibrations, we found that the accumulation of genetic diversity within the radiating lineages of the African Lakes Malawi, Victoria and Barombi Mbo, and Palaeolake Makgadikgadi began around or after the time of lake basin formation. These calibrations also suggest Lake Tanganyika was colonized independently by the major radiating cichlid tribes, which then began to accumulate genetic diversity thereafter. These results contrast with the widely accepted theory that diversification into major lineages took place within the Tanganyika basin. Together, this evidence suggests that ancient lake habitats have played a key role in generating and maintaining diversity within radiating lineages, and also that lakes may have captured preexisting cichlid diversity from multiple sources from which adaptive radiations have evolved.