938 results for Hattori chart
Abstract:
This paper presents an infrared vehicle separator designed for vehicle-separation detection in highway automatic toll collection systems. A light curtain formed by infrared emitters and receivers completely eliminates tailgating effects and reliably distinguishes semi-trailers, full trailers, and single vehicles. The high-reliability design overcomes the poor environmental adaptability and high failure rate of traditional infrared vehicle separators. The paper also introduces the basic principle of light-curtain formation, explains the key hardware points and how they are implemented, and gives the system hardware schematic and the basic software design.
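As a hedged illustration of the light-curtain principle described above (not the paper's actual firmware), the following Python sketch shows how a controller might turn a polled beam-state bitmask into vehicle enter/leave events, using a short all-clear debounce so that the gap between a tractor and its trailer is not mistaken for a gap between two vehicles; `read_beam_mask` is a hypothetical driver function.

```python
import time

GAP_CLEAR_MS = 50   # all beams must stay clear this long to count as a real gap

def vehicle_separation_events(read_beam_mask, now_ms=lambda: time.monotonic() * 1000):
    """Yield 'enter'/'leave' events as vehicles pass through the light curtain.

    read_beam_mask is a hypothetical driver call returning an int bitmask,
    one bit per infrared emitter/receiver pair; a set bit means that beam
    is blocked. The short all-clear debounce keeps the gap between a
    tractor and its (semi-)trailer from splitting one vehicle in two,
    while a sustained gap separates tailgating vehicles.
    """
    inside = False          # is a vehicle currently in the curtain?
    clear_since = None      # when did all beams last become clear?
    while True:
        if read_beam_mask() != 0:
            clear_since = None
            if not inside:
                inside = True
                yield "enter"                 # leading edge detected
        elif inside:
            if clear_since is None:
                clear_since = now_ms()
            elif now_ms() - clear_since >= GAP_CLEAR_MS:
                inside = False
                yield "leave"                 # one vehicle fully separated
```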
Abstract:
(1) I investigate the relationship between elastic parameters, lithology, and fluid content, which forms the physical basis for pre-stack seismic inversion. I examine the various approximate expressions of the Zoeppritz equations and establish the relationships among them. Geological models of water-bearing and gas-bearing sands at different depths were designed, and the accuracy of each approximation was evaluated. (2) For seismic data processing aimed at amplitude recovery for pre-stack seismic inversion, I propose adopting a dual processing workflow for the different objectives; pre-stack noise elimination, true-amplitude recovery, and NMO correction of long offsets are the key steps. (3) I give a systematic exposition of the rationale and applicability of the various expressions of elastic impedance, and mathematical models were used to compare their accuracy. I propose a new pre-stack simultaneous inversion based on the Zoeppritz equations and a simulated annealing algorithm. This method ensures the calculation accuracy of the reflection coefficient at different incidence angles and obtains a globally optimal solution, thereby improving the precision of pre-stack seismic inversion. (4) The objective function of P- and S-wave pre-stack simultaneous inversion was established. I compared the accuracy and convergence of simultaneous inversion with P-wave-only inversion, and the results show that simultaneous inversion is superior. Through the study of converted-wave AVO events, the AVO characteristics of different kinds of gas sands were analyzed. (5) I carried out pre-stack seismic inversion studies for carbonate reservoirs in the central Tarim Basin and sand-shale reservoirs in the Sulige area of the Ordos Basin. The methods and techniques developed in this thesis were applied to practical work, yielding predictions for heterogeneous reservoirs with good application results. Key words: reflection coefficient, amplitude recovery, pre-stack seismic inversion, heterogeneous reservoir, prediction.
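For context, one standard small-contrast approximation to the Zoeppritz P-P reflection coefficient is the Aki-Richards form; the sketch below (illustrative only, and not necessarily one of the specific expressions compared in the thesis) evaluates it across incidence angles, using the incidence angle in place of the P-wave average angle for simplicity.

```python
import numpy as np

def aki_richards_rpp(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """Small-contrast Aki-Richards approximation to the Zoeppritz
    P-P reflection coefficient at incidence angle theta (degrees).

    Illustrative only: strictly theta should be the average of the
    incidence and transmission angles; using the incidence angle is a
    common simplification for a sketch.
    """
    theta = np.radians(np.asarray(theta_deg, dtype=float))
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    k = (vs / vp) ** 2
    return (0.5 * (1 - 4 * k * np.sin(theta) ** 2) * drho / rho
            + dvp / (2 * vp * np.cos(theta) ** 2)
            - 4 * k * np.sin(theta) ** 2 * dvs / vs)

# Example: hypothetical gas sand below shale, incidence angles 0-40 degrees
angles = np.arange(0, 41, 5)
r = aki_richards_rpp(2900, 1330, 2290, 2540, 1620, 2090, angles)
print(np.round(r, 4))
```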
Abstract:
The macro-distribution of residual basins is a basic question in residual-basin research; the main objectives of macro-distribution study are to build the stratigraphic framework, compute the thickness of the residual strata, and analyze the characteristics of the residual basins. Guided by the theory of integrated geological and geophysical research, this paper assembles a series of methods and establishes a technical workflow based on gravity and magnetic data, constrained by geological, seismic, and drilling data. Through potential-field data processing and analysis, forward and inverse computation, regional potential-field analysis, and potential-field separation, it computes the depth of the gravity/magnetic basement and derives the stratigraphic framework; effective results were obtained in the study of the macro-distribution of residual basins in the Dagang area. Wavelet transforms of the gravity/magnetic data were computed with multiple wavelet bases using the à trous algorithm. Comparison of the processing results and their spectra for wavelet analysis, upward continuation, and filtering shows that the wavelet approximation fits the regional potential field better and is an effective method for separating the gravity/magnetic effects caused by deep geological bodies. An experiment with matching pursuit shows that transform-domain methods have great advantages in potential-field data analysis. From the integrated geophysical study of rock properties, gravity/magnetic basement inversion, and fault-system analysis of the Dagang area, the stratigraphic framework and the thickness of the pre-Cenozoic residual strata are obtained. Through comprehensive study with gravity and magnetotelluric profile inversion and interpretation, three prospective plays for the macro-distribution of residual basins are identified. The northern part of the Chengning Uplift has great residual strata thickness, thrust faulting in the deep zone, and good Upper Paleozoic hydrocarbon source rocks; integrated analysis suggests that this area is the most prospective hydrocarbon location among the pre-Cenozoic residual basins.
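A minimal sketch of the à trous idea on a 1D profile may help: the undecimated transform repeatedly smooths with a dilated B3-spline kernel, the differences between successive smoothings are the wavelet planes (local anomalies), and the final smooth residual approximates the regional field. This is an assumption-laden toy; the study worked with 2D gravity/magnetic grids and several wavelet bases.

```python
import numpy as np
from scipy.ndimage import convolve1d

# B3-spline kernel used in the classic "a trous" (starlet) transform
H = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0

def a_trous(signal, levels):
    """Undecimated (a trous) wavelet decomposition of a 1D profile.

    Returns (details, smooth): details[j] holds structure at scale 2**j,
    approximating local anomalies, while the coarse residual 'smooth'
    plays the role of the regional field. Sketch only.
    """
    c = np.asarray(signal, dtype=float)
    details = []
    for j in range(levels):
        # dilate the kernel by inserting 2**j - 1 zeros between taps
        kernel = np.zeros((len(H) - 1) * 2 ** j + 1)
        kernel[:: 2 ** j] = H
        c_next = convolve1d(c, kernel, mode="reflect")
        details.append(c - c_next)   # wavelet plane at this scale
        c = c_next
    return details, c

# Toy profile: long-wavelength "regional" plus a short "local" anomaly
x = np.linspace(0, 100, 512)
g = 5 * np.sin(x / 30) + np.exp(-((x - 50) ** 2) / 4)
details, regional = a_trous(g, levels=5)
```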
Abstract:
The sedimentary and diagenetic processes of a reservoir are the key factors controlling the formation and distribution of hydrocarbon reservoirs. For quite a long time, research on sedimentary-diagenetic facies focused mainly on qualitative analysis. With the further development of oilfield exploration, qualitative analysis alone cannot meet the complicated requirements of oil and gas exploration, so quantitative analysis of sedimentary-diagenetic facies and the related facies modeling have become more and more important. Building on stratigraphic and sedimentological results for the Putaohua oil layer group in the Gulong area, starting from the basic principles of sedimentology, and supported by field core studies and related research results, this thesis investigates the sediment types, the spatial framework of the sands, and the evolution of diagenesis, with emphasis on sedimentary-system analysis and diagenetic alteration. It then moves from qualitative description to quantitative classification of sedimentary-diagenetic facies, discusses a new way of dividing them, and thereby offers a new basis for reservoir exploration. Using statistical methods including factor analysis, cluster analysis, and discriminant analysis, the thesis divides sedimentary-diagenetic facies quantitatively; this is an innovative approach to their study. First, factor analysis reveals the main mechanisms behind the correlated variables in the geological body, leading to conclusions about the factors controlling fluids and reservoir quality in the studied interval. Second, with the main parameters selected for cluster analysis, the classification of diagenesis rests mainly on the data themselves, so the investigator's subjective judgement is largely eliminated and the results are more quantitative, which aids the related statistical analysis and allows further study of the quantitative relations among the facies types. Finally, discriminant analysis checks the reliability of the cluster results, and discriminant probabilities are used to construct charts that map sedimentary-diagenetic facies in plan view, leading to more dependable analytical results. The research shows that combining multiple statistical methods yields a quantitative analysis of reservoir sedimentary-diagenetic facies whose final results are more reliable and easier to apply in practice.
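The three-step statistical workflow can be sketched with scikit-learn; the array `X` of reservoir parameters is synthetic and purely hypothetical, and the thesis' actual parameter choices and algorithm settings may differ.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical sample table: rows are core samples, columns are reservoir
# parameters (e.g. porosity, permeability, shale content, ...).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))

# 1) Factor analysis: compress correlated variables into a few factors
#    interpreted as the main controls on reservoir quality.
scores = FactorAnalysis(n_components=3).fit_transform(
    StandardScaler().fit_transform(X))

# 2) Cluster analysis on the factor scores: data-driven facies classes,
#    reducing the interpreter's subjective judgement.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

# 3) Discriminant analysis: check how reliably the clusters separate and
#    obtain class probabilities for mapping facies in plan view.
lda = LinearDiscriminantAnalysis().fit(scores, labels)
print("resubstitution accuracy:", lda.score(scores, labels))
proba = lda.predict_proba(scores)   # per-sample facies probabilities
```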
Abstract:
The primary means by which people understand the inner properties of the earth and the distribution of mineral resources are surface geological surveys and the inversion and interpretation of geophysical/geochemical data. The purpose of seismic inversion is to extract, from the seismic wavefield, information on subsurface geometrical structures and the distribution of material properties, for use in resource prospecting and exploitation and in the study of the earth's interior and its dynamic processes. Although the study of seismic parameter inversion has achieved a great deal since the 1950s, problems persist when the methods are applied to real data because of their nonlinearity and ill-posedness. Most methods used to invert geophysical parameters are iterative and depend heavily on the initial model and constraint conditions, and it is difficult to obtain a believable result once one accounts for factors such as environmental and equipment noise in seismic wave excitation, propagation, and acquisition. Seismic inversion on real data is a typically nonlinear problem whose objective functions usually have multiple minima, so commonly used methods such as generalized-linearization and quasi-linearization inversion struggle because of local convergence. Global nonlinear search methods that do not rely heavily on the initial model are more promising, but the amount of computation they require for real-data processing is unacceptable. To address these problems, this paper develops a class of global nonlinear inversion methods that bring the Quantum Monte Carlo (QMC) method into geophysical inverse problems. QMC is an effective numerical method for studying quantum many-body systems governed by the Schrödinger equation, and it can be divided into zero-temperature and finite-temperature methods. The paper has four parts. In the first, we briefly review the theory of the QMC method, establish its connections with geophysical nonlinear inversion, and give the flow chart of the algorithm. In the second, we apply four QMC inversion methods to 1D wave-equation impedance inversion and compare their results in terms of convergence rate and accuracy; the feasibility, stability, and noise tolerance of the algorithms are also discussed. Numerical results demonstrate that geophysical nonlinear inversion and other nonlinear optimization problems can be solved by means of the QMC method, and that Green's function Monte Carlo (GFMC) and diffusion Monte Carlo (DMC) are more applicable to real data than Path Integral Monte Carlo (PIMC) and Variational Monte Carlo (VMC). The third part provides parallel versions of the serial QMC algorithms, applies them to 2D acoustic velocity inversion and real seismic data processing, and further examines their global search behaviour and noise tolerance. The inverted results show the robustness of these algorithms and their feasibility for 2D inversion and real-data processing; the parallel inversion algorithms of this chapter are also applicable to other optimization problems. Finally, useful conclusions are drawn in the last section. The analysis and comparison of the results indicate that bringing QMC into geophysical inversion is successful.
QMC is a nonlinear inversion approach offering stability, efficiency, and noise tolerance. Its most appealing property is that it does not rely heavily on the initial model, which suits it to nonlinear, multi-minimum geophysical inverse problems. The method can also be used in other fields involving nonlinear optimization.
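As a loose illustration of the branching idea that diffusion-style QMC brings to global search (not the dissertation's GFMC/DMC/PIMC/VMC implementations), the Python toy below lets a population of walkers diffuse through model space and replicates or kills them according to a misfit-based weight, so the ensemble drifts toward low-misfit models without needing a good initial model.

```python
import numpy as np

rng = np.random.default_rng(1)

def dmc_style_search(forward, d_obs, ndim, n_walkers=200, n_steps=400,
                     step=0.05, bounds=(0.0, 1.0)):
    """Toy diffusion-Monte-Carlo-flavoured global search.

    Walkers take random diffusion steps; each generation they are
    replicated or killed with a weight exp(-(E - E_ref)), where E is the
    least-squares data misfit standing in for a potential energy. A loose
    sketch of the branching idea only, not the thesis' algorithms.
    """
    lo, hi = bounds
    walkers = rng.uniform(lo, hi, size=(n_walkers, ndim))
    best, best_e = None, np.inf
    for _ in range(n_steps):
        walkers = np.clip(walkers + rng.normal(0.0, step, walkers.shape), lo, hi)
        e = np.array([np.sum((forward(w) - d_obs) ** 2) for w in walkers])
        if e.min() < best_e:
            best_e, best = e.min(), walkers[e.argmin()].copy()
        # branching: stochastically round each walker's weight to a copy count
        w = np.exp(np.clip(-(e - np.median(e)) / (e.std() + 1e-12), -5.0, 2.0))
        n_copies = (w + rng.uniform(size=len(w))).astype(int)
        walkers = np.repeat(walkers, n_copies, axis=0)
        if len(walkers) >= n_walkers:            # keep population size fixed
            walkers = walkers[rng.permutation(len(walkers))[:n_walkers]]
        else:                                    # top up a depleted population
            extra = rng.uniform(lo, hi, size=(n_walkers - len(walkers), ndim))
            walkers = np.vstack([walkers, extra])
    return best, best_e

# Toy 5-layer problem with a stand-in forward operator
true_m = np.array([0.2, 0.7, 0.4, 0.9, 0.5])
forward = lambda m: np.cumsum(m)
d_obs = forward(true_m) + rng.normal(0.0, 0.01, true_m.size)
m_hat, e_hat = dmc_style_search(forward, d_obs, ndim=5)
print(np.round(m_hat, 2), e_hat)
```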
Abstract:
Multi-wave, multi-component seismic data are attracting more and more attention from the oil industry. Building on existing research results, my work focuses on some key steps in processing OBC four-component (4C) data. OBC data must be preprocessed carefully to obtain a good image; we present a preprocessing flow comprising noise attenuation on the multi-component data, ghost elimination by summing the P and Z components, and rotation of the horizontal components, which provides a good foundation for the subsequent OBC processing steps. Obtaining the exact conversion-point location and analyzing velocity are key points in processing converted-wave reflection data, so this thesis covers computation of the conversion-point location, velocity analysis, and the nonhyperbolic moveout of converted waves. Anisotropy strongly affects both the converted-wave conversion point and the nonhyperbolic moveout; assuming VTI media, we study these anisotropic effects. Since Vp/Vs is important, we also study methods of computing Vp/Vs from post-stack and pre-stack data, and part of the thesis inverts anisotropic parameters from traveltimes. Pre-stack time migration of converted waves is a further focus: using common-offset Kirchhoff migration, we study velocity-model updating in anisotropic media. I have achieved the following results: 1) using continued fractions, we propose a new approximate equation for the conversion point; when the offset is long, Thomsen's second-order equation no longer approximates the exact conversion-point location well, whereas our equation remains a good approximation. 2) Our new methods for scanning nonhyperbolic velocity and Vp/Vs yield a high-quality energy spectrum, and the new moveout fits mid- and long-offset events; processing field data gives good results. 3) A new moveout equation, of the same form as Alkhalifah's long-offset P-wave moveout equation, achieves the same accuracy as Thomsen's moveout equation in tests on model data. 4) Treating the coefficient C as a function of the offset-to-depth ratio, we unify Li's and Thomsen's moveout equations in a single expression; model tests show that a reasonable choice of C improves the accuracy of both. 5) Using traveltime inversion, we obtain anisotropic parameters that help flatten long-offset events, and we propose an anisotropic parameter model useful for converted-wave pre-stack time migration in anisotropic media. 6) Using our pre-stack time migration method and flow, we update the velocity and anisotropic-parameter models and obtain good images. Key words: OBC, common conversion point (CCP), nonhyperbolic moveout equation, normal moveout correction, velocity analysis, anisotropic parameter inversion, Kirchhoff anisotropic pre-stack time migration, migration velocity model updating
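For orientation, the simplest closed form for the conversion-point location is the asymptotic conversion point, accurate only when the offset is small relative to reflector depth; it is this zeroth-order result that Thomsen's second-order equation and the thesis' continued-fraction expression refine at long offsets. A minimal sketch:

```python
def acp_offset(offset, vp_vs):
    """Asymptotic conversion point: horizontal distance from the source at
    which the downgoing P leg converts to the upgoing S leg, valid when
    offset << depth.

        x_c = offset * gamma / (1 + gamma),   gamma = Vp/Vs

    Only the zeroth-order term; higher-order forms are needed at the long
    offsets discussed in the abstract.
    """
    gamma = vp_vs
    return offset * gamma / (1.0 + gamma)

# Example: gamma = 2 puts the conversion point 2/3 of the way to the receiver
print(acp_offset(1000.0, 2.0))   # ~666.7 m from the source
```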
Abstract:
BACKGROUND: Injuries represent a significant and growing public health concern in the developing world, yet their impact on patients and on the emergency health-care systems of East African countries has received limited attention. This study evaluates the magnitude and scope of injury-related disorders in the population presenting to a referral-hospital emergency department in northern Tanzania. METHODS: A retrospective chart review of patients presenting to the emergency department at Kilimanjaro Christian Medical Centre was performed. A standardized data collection form was used to abstract data from the emergency department logbook and the complete medical record for all injured patients. Patient demographics, mechanism of injury, and the location, type, and outcome of injury were recorded. RESULTS: 10,622 patients presented to the emergency department for evaluation and treatment during the 7-month study period, of whom 1,224 (11.5%) had injuries. Males and individuals aged 15 to 44 years were most frequently injured, representing 73.4% and 57.8% of the injured, respectively. Road traffic injuries were the most common mechanism, accounting for 43.9% of injuries. Head injuries (36.5%) and extremity injuries (59.5%) were the most common injury locations. The majority of injured patients (59.3%) were admitted from the emergency department to the hospital wards, and 5.6% required admission to an intensive care unit. Death occurred in 5.4% of injured patients. CONCLUSIONS: These data give a detailed and robust picture of the patient demographics, mechanisms of injury, injury types, and patient outcomes in a resource-limited setting.
Abstract:
PURPOSE: The endoplasmic reticulum-associated degradation pathway is responsible for the translocation of misfolded proteins across the endoplasmic reticulum membrane into the cytosol for subsequent degradation by the proteasome. To define the phenotype associated with a novel inherited disorder of cytosolic endoplasmic reticulum-associated degradation pathway dysfunction, we studied a series of eight patients with deficiency of N-glycanase 1. METHODS: Whole-genome, whole-exome, or standard Sanger sequencing techniques were employed. Retrospective chart reviews were performed in order to obtain clinical data. RESULTS: All patients had global developmental delay, a movement disorder, and hypotonia. Other common findings included hypolacrima or alacrima (7/8), elevated liver transaminases (6/7), microcephaly (6/8), diminished reflexes (6/8), hepatocyte cytoplasmic storage material or vacuolization (5/6), and seizures (4/8). The nonsense mutation c.1201A>T (p.R401X) was the most common deleterious allele. CONCLUSION: NGLY1 deficiency is a novel autosomal recessive disorder of the endoplasmic reticulum-associated degradation pathway associated with neurological dysfunction, abnormal tear production, and liver disease. The majority of patients detected to date carry a specific nonsense mutation that appears to be associated with severe disease. The phenotypic spectrum is likely to enlarge as cases with a broader range of mutations are detected.
Abstract:
BACKGROUND: In Tanzania, HIV-1 RNA testing is rarely available and is not standard of care. Determining virologic failure is therefore challenging, and resistance mutations accumulate, compromising second-line therapy. We evaluated the durability of antiretroviral therapy (ART) and predictors of virologic failure in a pediatric cohort at four-year follow-up. METHODS: This was a prospective cross-sectional study with retrospective chart review evaluating a perinatally HIV-infected Tanzanian cohort enrolled in 2008-09, with repeat HIV-1 RNA testing in 2012-13. Demographic, clinical, and laboratory data were extracted from charts, resistance mutations from 2008-09 were analyzed, and prospective HIV RNA was obtained. RESULTS: 161 participants (78% of the original cohort) consented to repeat HIV RNA testing. The average age was 12.2 years (55% adolescents ≥12 years), and the average time on ART was 6.4 years, with 41% receiving second-line (protease inhibitor-based) therapy. Among those originally suppressed on a first-line (non-nucleoside reverse transcriptase inhibitor-based) regimen, 76% remained suppressed. Of those originally failing first-line therapy, 88% were switched to second-line therapy, and 72% of these have suppressed virus. Higher levels of viremia and longer duration of ART trended with an increased number of thymidine analogue mutations (TAMs). Increased TAMs raised the odds of virologic failure (p = 0.18), as did adolescent age (p < 0.01). CONCLUSIONS: After viral load testing in 2008-09 many participants switched to second-line therapy, and the majority achieved virologic suppression despite multiple resistance mutations. Although virologic testing would likely hasten the switch to second-line therapy among those failing, methods to improve adherence are critical to maximize the durability of ART and improve virologic outcomes among youth in resource-limited settings.
Abstract:
The purpose of this study was to identify the preoperative predictors of hospital length of stay after primary total knee arthroplasty in a patient population reflecting current trends toward shorter hospitalization and using readily obtainable factors that do not require scoring systems. A single-center, multi-surgeon retrospective chart review of two hundred and sixty consecutive patients who underwent primary total knee arthroplasty was performed. The mean length of stay was 3.0 days. Among the different variables studied, increasing comorbidities, lack of adequate assistance at home, and bilateral surgery were the only multivariable significant predictors of longer length of stay. The study was adequately powered for statistical analyses and the concordance index of the multivariable logistic regression model was 0.815.
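For a binary outcome such as prolonged length of stay, the concordance index of a logistic regression model equals the area under the ROC curve; the sketch below illustrates the computation on synthetic stand-in data (the predictor names and effect sizes are hypothetical, not the study's estimates).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical data standing in for the chart-review variables:
# comorbidity count, adequate help at home (0/1), bilateral surgery (0/1).
rng = np.random.default_rng(0)
n = 260
X = np.column_stack([rng.poisson(2, n),
                     rng.integers(0, 2, n),
                     rng.integers(0, 2, n)])
logit = -2.0 + 0.6 * X[:, 0] - 0.8 * X[:, 1] + 1.2 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # 1 = prolonged length of stay

model = LogisticRegression().fit(X, y)
# For a binary outcome, the concordance (c) index equals the ROC AUC.
c_index = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"c-index = {c_index:.3f}")
```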
Abstract:
PURPOSE: Risk-stratified guidelines can improve quality of care and cost-effectiveness, but their uptake in primary care has been limited. MeTree, a Web-based, patient-facing risk-assessment and clinical decision support tool, is designed to facilitate uptake of risk-stratified guidelines. METHODS: A hybrid implementation-effectiveness trial in three clinics (two intervention, one control). PARTICIPANTS: consentable nonadopted adults with upcoming appointments. PRIMARY OUTCOME: agreement between patient risk level and risk management, before and after MeTree, for those meeting evidence-based criteria for increased-risk management strategies (increased risk) and those not meeting them (average risk). MEASURES: chart abstraction was used to identify risk management related to colon, breast, and ovarian cancer, hereditary cancer, and thrombosis. RESULTS: Participants = 488; female = 284 (58.2%); white = 411 (85.7%); mean age = 58.7 (SD = 12.3). Agreement between risk management and risk level for all conditions for each participant (except colon cancer, which was limited to those <50 years of age) was (i) 1.1% (N = 2/174) for the increased-risk group before MeTree and 16.1% (N = 28/174) after, and (ii) 99.2% (N = 2,125/2,142) for the average-risk group before MeTree and 99.5% (N = 2,131/2,142) after. Of those receiving increased-risk management strategies at baseline, 10.5% (N = 2/19) met criteria for increased risk; after MeTree, 80.7% (N = 46/57) met criteria. CONCLUSION: Integrating MeTree into primary care can improve uptake of risk-stratified guidelines and potentially reduce both "overuse" and "underuse" of increased-risk services.
Abstract:
The historic pattern of public sector pay movements in the UK has been counter-cyclical with private sector pay growth. Periods of relative decline in public sector pay against private sector movements have been followed by periods of ‘catch-up’ as Government controls are eased to remedy skill shortages or to deal with industrial unrest among public servants. Public sector catch-up increases have therefore come at awkward times for Government, often coinciding with economic downturn in the private sector (Trinder 1994; White 1996; Bach 2002). Several such epochs of public sector pay policy can be identified since the 1970s. The question is whether the current limits on public sector pay being imposed by the UK Government fit this historic pattern or whether the pattern has been broken and, if so, how and why. This paper takes a historical approach to the context of public sector pay determination in the UK. In particular, it reviews the period since Labour came into office (White and Hatchett 2003) and the various pay ‘modernisation’ exercises of the last decade (White 2004). The paper draws on national statistics on public sector employment and pay levels to chart changes in public sector pay policy, and on secondary literature to consider both the Government's policy intentions and the impact of these policies on public servants.
Abstract:
During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but the logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A scoping study was therefore commissioned by the Joint Nature Conservation Committee, as part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the archived material and the procedure and cost of undertaking further work. The results of the scoping study are:
1. The NMBL archives hold 106 videotapes (reel-to-reel Sony HD format) and 59 video cassettes (including 15 from the Irish Sea) in VHS format, together with 90 rolls of 35 mm colour transparency film (various lengths, up to about 240 frames per film). These are stored in the Archive Room, either in a storage cabinet or in the original film canisters.
2. The reel-to-reel material is extensive and had already been selectively copied to VHS cassettes. The cost of transferring it to an accepted ‘long-life’ medium (Betamax) would be approximately £15,000. It was not possible to view the tapes as a suitable machine was not located. The value of the tapes is uncertain, but they are likely to be beyond salvation within one to two years.
3. The video cassette material is in good condition and is expected to remain so for at least several more years. The images viewed were generally of poor quality, and the speed of tow often makes the pictures blurred. No immediate action is required.
4. The colour transparency films are in good condition and the images are very clear; they provide the best source of information for mapping seabed biotopes. They should be scanned to digital format, but inexpensive fast copying is problematic because there are no between-frame breaks between images and scanning machines centre each image based on between-frame breaks. The minimum cost of scanning all of the images commercially is approximately £6,000 and could be as much as £40,000 on some quotations. There is a further cost in coding and databasing each image, and, all in all, it would seem most economic to purchase a ‘continuous film’ scanner and undertake the work in-house.
5. Positional information in the ships' logs has been matched to the films and video tapes. Decca Chain co-ordinates recorded in the logbooks have been converted to latitude and longitude (degrees, minutes and seconds), and a further routine was developed to convert these to the decimal degrees required for GIS mapping (see the conversion sketch after this list). However, it is unclear whether corrections to Decca positions were applied at the time each position was noted. Tow tracks have been mapped onto an electronic copy of a Hydrographic Office chart.
6. The positions of the start and end of each tow were entered into a spreadsheet so that they can be displayed in GIS or on a Hydrographic Office chart backdrop. The cost of the Hydrographic Office chart backdrop at a scale of 1:75,000 for the whole area was £458 incl. VAT.
7. Viewing all of the video cassettes to note habitats and biological communities, even by an experienced marine biologist, would take at least on the order of 200 hours and is not recommended.
8. Once the colour transparencies are scanned and indexed, viewing them to identify seabed habitats and biological communities would probably take about 100 hours for an experienced marine biologist and is recommended.
9. It is expected that identifying biotopes along approximately 1 km lengths of each tow would be feasible, although uncertainties about Decca co-ordinate corrections and the exact positions of images most likely give a ±250 m position error. More work to locate each image accurately and to resolve the Decca correction question would improve the accuracy of image locations.
10. Using the codings produced by Holme to identify different seabed types, and some viewing of the video and transparency material, 10 biotopes have been identified, although more would be added as a result of full analysis.
11. Using the data available from the Holme archive, it is possible to populate various fields within the Marine Recorder database. The overall ‘survey’ will be ‘English Channel towed video sled survey’, and the ‘events’ become the 104 tows. Each tow could be described as four samples, i.e. the start and end of the tow and two areas in the middle, to give examples along the length of the tow. These samples would have their own latitude/longitude co-ordinates and would link to a GIS map.
12. Stills and video clips, together with text information, could be incorporated into a multimedia presentation to demonstrate the range of level-seabed types found along part of the northern English Channel. More recent images, taken during SCUBA diving on reef habitats in the same area as the towed sledge surveys, could be added to the Holme images.
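Item 5's conversion from degrees/minutes/seconds to decimal degrees is a one-line computation; a minimal sketch follows (the example coordinates are illustrative, and no Decca chain correction is applied):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to the decimal degrees needed for GIS.

    West and South positions are returned negative. No Decca chain
    correction is applied -- the report notes it is unclear whether such
    corrections were applied when positions were logged.
    """
    dd = degrees + minutes / 60.0 + seconds / 3600.0
    return -dd if hemisphere in ("W", "S") else dd

print(dms_to_decimal(50, 21, 30, "N"))   # 50.358333...
print(dms_to_decimal(4, 9, 0, "W"))      # -4.15 (illustrative, near Plymouth)
```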
Abstract:
This study addresses the long-term stability of three trophic groupings in the Northeast Atlantic at regional scales. The most abundant taxa representing phytoplankton, herbivorous copepods, and carnivorous zooplankton were examined from the Continuous Plankton Recorder database. Multivariate control charts using a Bray–Curtis similarity metric were used to assess whether fluctuations within trophic groupings were within or beyond the expected variability. Two evaluation periods were examined: annual changes between 1960 and 1999 (2000–2009 baseline) and recent changes between 2000 and 2009 (1960–1999 baseline). The trends over time in abundance/biomass of trophic levels were region-specific, especially in carnivorous copepods, where abundance did not mirror trends in the overall study area. The stability of phytoplankton was within the expected limits, although not in 2008 and 2009. Higher trophic levels were less stable, perhaps reflecting the added complexity of interactions governing their abundance. In addition, some regions were consistently less stable than others. Correlations in stability between adjacent trophic levels were positive at large marine ecosystem scale but generally non-significant at regional scales. The study suggests that certain regions may be particularly vulnerable to periods of instability in community structure. The benefits of using the control chart method rather than other multivariate measures of plankton dynamics are discussed.
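A schematic version of such a control chart can be sketched in a few lines: the Bray–Curtis distance of each evaluation-period year from the baseline centroid is compared against a limit derived from the spread of the baseline years. This only illustrates the general approach (the data here are synthetic, and the paper's actual construction of control limits from Bray–Curtis similarities may differ in detail).

```python
import numpy as np
from scipy.spatial.distance import braycurtis

def control_chart_distances(baseline, evaluation, quantile=0.95):
    """Multivariate control chart in the spirit of the study's method.

    baseline, evaluation: arrays of shape (years, taxa abundances).
    Returns the evaluation-year distances, the control limit set from the
    baseline years' own variability, and flags for years beyond the limit.
    """
    baseline = np.asarray(baseline, dtype=float)
    centroid = baseline.mean(axis=0)
    base_d = np.array([braycurtis(row, centroid) for row in baseline])
    limit = np.quantile(base_d, quantile)       # "expected variability"
    eval_d = np.array([braycurtis(row, centroid) for row in evaluation])
    return eval_d, limit, eval_d > limit        # flag out-of-control years

# Toy example: 1960-1999 baseline vs 2000-2009 evaluation for 10 taxa
rng = np.random.default_rng(0)
base = rng.gamma(2.0, 1.0, size=(40, 10))
evalu = rng.gamma(2.0, 1.2, size=(10, 10))     # slightly shifted community
d, limit, flags = control_chart_distances(base, evalu)
print(np.round(d, 3), round(limit, 3), flags)
```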