939 results for technology-based


Relevance: 60.00%

Abstract:

This paper presents the information display and storage technology of a manned submersible based on industrial Ethernet. The software architecture of the system and the features of its modules are described in detail, as are the transfer, display, and storage of the data streams within the system. The system has been tested in a water tank, with good experimental results.

Relevance: 60.00%

Abstract:

The anti-resonance principle can effectively reduce the force a vibrating machine transmits to its foundation and extend the machine's service life. A dynamic model of an origin anti-resonance vibrating machine is established and its working principle explained. Taking the amplitude stability of the working body and the lower mass as the core concern, the influence of the mass ratio and the anti-resonance frequency ratio on amplitude stability is analyzed with the other system parameters fixed, yielding response surfaces that relate these two combined parameters to the dynamic magnification factors of the upper and lower masses. From these surfaces, parameter ranges can be identified that satisfy process requirements while guaranteeing amplitude stability, providing an important basis for the design of anti-resonance vibrating machines of all types. The effect of fluctuations in material mass on amplitude stability, and the resulting drift of the anti-resonance point, are also studied, revealing the necessity of controlling the excitation frequency of the machine. Given a reasonable combination of dynamic parameters, introducing such control effectively improves the amplitude stability of the working body and the lower mass.
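The two-mass dynamics behind origin anti-resonance can be illustrated with a minimal undamped model. The symbols follow the usual two-degree-of-freedom treatment, and all numerical values below are illustrative assumptions, not parameters from the paper.

```python
import math

# Lower mass m1 carries the exciting force F and sits on spring k1 (to
# the foundation); the working body m2 couples to it through spring k2.
def amplitudes(m1, m2, k1, k2, F, w):
    """Steady-state amplitudes X1 (lower mass) and X2 (working body)
    under harmonic excitation F*cos(w*t) applied to m1."""
    det = (k1 + k2 - m1 * w**2) * (k2 - m2 * w**2) - k2**2
    return F * (k2 - m2 * w**2) / det, F * k2 / det

m1, m2, k1, k2, F = 100.0, 20.0, 4e5, 8e4, 1e3
w_anti = math.sqrt(k2 / m2)   # at this frequency X1 vanishes: anti-resonance
```

Sweeping `w` around `w_anti` for different mass ratios m2/m1 traces out the amplitude surfaces the abstract discusses, and shows why a drift of k2/m2 (e.g. from material-mass fluctuation) moves the anti-resonance point and motivates excitation-frequency control.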

Relevance: 60.00%

Abstract:

The integration of, and interoperation among, multiple databases in a heterogeneous distributed computing environment is a major research area of information integration in CIMS. Building on an analysis of existing work and the requirements of information integration under CIMS, this paper proposes a multi-database integration and interoperation technique based on a view-object (ViewObject) mechanism. The basic idea is to abstract and encapsulate data from multiple data sources in several layers of view objects, so as to meet the integration needs of applications at different levels in a CIMS environment. Based on the Common Object Request Broker Architecture (CORBA), the paper also presents an implementation mechanism for VO-MDBS, a multi-database integration and interoperation system built on an Object Request Broker (ORB).
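The layered view-object idea can be sketched as a uniform read interface over heterogeneous sources, with views composable into higher-level views. Plain dicts stand in for the underlying databases, and all names are illustrative, not taken from the actual VO-MDBS system.

```python
class ViewObject:
    """One uniform read interface over heterogeneous data sources,
    searched in priority order; views nest, giving multi-layer
    abstraction and encapsulation."""

    def __init__(self, *sources):
        self.sources = sources

    def __contains__(self, key):
        return any(key in src for src in self.sources)

    def __getitem__(self, key):
        for src in self.sources:
            if key in src:
                return src[key]
        raise KeyError(key)

# Layered abstraction: a plant-wide view encapsulates two source
# "databases", and is itself wrapped by a higher-level enterprise view.
erp_db = {"part-101": "gearbox"}
shopfloor_db = {"part-202": "spindle"}
plant_view = ViewObject(erp_db, shopfloor_db)
enterprise_view = ViewObject(plant_view, {"site": "plant-A"})
```

In the real system each source would be reached through an ORB rather than held in memory, but the layering of abstractions is the same.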

Relevance: 60.00%

Abstract:

To repair shaft parts, a green remanufacturing system based on laser cladding was built, consisting of a 6 kW CO2 laser, a four-axis worktable, a powder feeder, a CNC system, and the driver software of the laser green remanufacturing system. Taking shafts as typical parts and Ni60A alloy powder as the cladding material, the laser green remanufacturing process was studied. By analyzing how the main parameters (laser power, cladding speed, powder feed rate, and cladding pitch) affect the height, width, and quality of the cladding layer, the optimal combination of process parameters for laser repair of shaft parts was obtained, bringing laser-cladding-based green remanufacturing into production practice.

Relevance: 60.00%

Abstract:

Exploration has shown that the East China Sea shelf basin contains abundant hydrocarbon resources, but the area remains poorly explored and presents many difficulties. One is that the gas reservoirs, which vary rapidly in the lateral direction, are deeply buried, and the impedance contrast among sandstone, gas sand, and shale is very poor. Another is that the signal-to-noise (S/N) ratio of the seismic data is very low and multiples are prevalent, which seriously hinders reservoir identification. The resolution of seismic reflections from 2500-3000 m depth is also low, which limits the application of hydrocarbon direct identification (HDI) technology. This research established a detailed geological and geophysical model of the Lishui area of the East China Sea based on drilling, well-logging, geological, and seismic data. A method for extracting the Q value from seismic data is proposed; with it, Q-value inversion from VSP and seismic data determines the subsurface absorption of the area, so that wave propagation and absorption behavior are understood and field acquisition design can be guided. At the same time, optimization of the source system enhances the performance of the high-resolution seismic acquisition layout. Together these lay a firm foundation for gas-reservoir exploration in the East China Sea. To address the multiple-attenuation and amplitude-preservation problems in seismic data processing, wave-equation pre-stack amplitude-preserving migration and wave-equation iterative-feedback multiple-attenuation technologies were developed. The amplitude-preserving migration preserves amplitude in both the imaging condition and the wave-field extrapolation; the multiple-removal technology is independent of the seismic source wavelet and the velocity model, avoiding the weaknesses of the Delft method.
To cope with the complicated formation conditions of the gas reservoirs in this area, and by dissecting typical hydrocarbon reservoirs, a series of targeted, advanced seismic gas-reservoir identification technologies was studied and developed, including petrophysical property analysis and seismic modeling, joint pre-stack/post-stack elastic inversion, and attribute extraction based on non-stationary signal theory and formation absorption characteristics. Integrated analysis of pre-stack/post-stack seismic data, reservoir information, rock physics, and attribute information was performed, and a suite of gas-reservoir identification technologies was assembled to match the geological and geophysical characteristics of the area. The developed technologies were applied, with integrated interpretation and appraisal, in Lishui 36-1, where their validity was tested and verified. The hydrocarbon charging potential and position of three East China Sea gas exploration targets were also clearly identified.
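One standard approach to the interval-Q estimation mentioned above is the spectral-ratio method: between two receiver levels separated by travel time dt, ln(A_deep/A_shallow) is linear in frequency with slope -pi*dt/Q. This is a generic sketch of that textbook technique, not necessarily the extraction method developed in the thesis; all data below are synthetic.

```python
import numpy as np

def estimate_q(freqs, amp_shallow, amp_deep, dt):
    """Fit ln(A_deep/A_shallow) = -pi*f*dt/Q by least squares and
    return the interval quality factor Q."""
    y = np.log(amp_deep / amp_shallow)
    slope = np.polyfit(freqs, y, 1)[0]   # linear fit in frequency
    return -np.pi * dt / slope
```

With VSP data the two amplitude spectra come from windowed downgoing waves at the two depths; here a synthetic pair with known Q serves as a check.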

Relevance: 60.00%

Abstract:

Hydrocarbon migration and accumulation are the key processes that form reservoirs in sedimentary basins, and they are among the most difficult topics to study in petroleum geology. In this paper, the western segment of the northern margin of the Qaidam Basin was selected as the study area. The concept of a fault open coefficient, which combines the multiple factors governing fault sealing, was applied to estimate semi-quantitatively the sealing behavior of six faults considered to control hydrocarbon migration and accumulation. Borehole data were used to appraise the permeability of the lithologic assemblages above and beneath the unconformity surface; the results suggest that the basal conglomerates frequently constitute the carrier beds. Borehole and outcrop data were collected to describe the sand carrier system. Because the basin's inversion made the formations very steep, a phased method was adopted to build the basin models: for steps before the Pliocene, maps of restored true thickness were used to build the basin blocks; for steps after the Pliocene, present-day structure maps were used. During modeling, the results were calibrated against various measured data. The modeled results include the dynamic evolution of trap formation, vitrinite reflectance maturity, the hydrocarbon expulsion intensity of the source rocks, fluid potential, and petroleum plays. Integrating the expulsion intensity, fluid potential, and carrier system, a migration technology based on percolation theory was applied to simulate the course of oil and gas migration and accumulation during the main accumulation periods; the dominant migration pathways clearly indicate the distribution of prospects.
Based on these migration characteristics, the main controlling factors were synthesized, including the distribution of effective source rocks, the timing match between structural trap formation and hydrocarbon expulsion from source rocks, the Mesozoic-Cenozoic unconformity, the structures, and Quaternary fault movement. Finally, the prospective plays in the study area are identified.

Relevance: 60.00%

Abstract:

The Ordos Basin is a typical cratonic petroliferous basin with 40 oil- and gas-bearing bed sets. It features stable multicycle sedimentation, gentle formations, and few structures. The reservoir beds in the Upper Paleozoic and Mesozoic are mainly of low density and low permeability, with strong lateral variation and strong vertical heterogeneity. The well-known Loess Plateau covers the southern part of the basin, and the Maowusu Desert, Kubuqi Desert, and Ordos Grasslands cover the north, so seismic acquisition is very difficult and the data often suffer from inadequate precision, strong interference, low signal-to-noise ratio, and low resolution. Because of the complicated surface and subsurface conditions, it is very difficult to distinguish thin beds and to study continental high-resolution lithologic sequence stratigraphy from routine seismic profiles. A method with clear physical significance, grounded in advanced mathematical-physics theory and algorithms, is therefore needed to improve the precision with which thin continental sand-mud interbed configurations can be detected. The Generalized S Transform (GST) provides a new phase-space analysis method for seismic data. Like the wavelet transform, it has very good localization characteristics; but being directly related to the Fourier spectrum, the GST has clearer physical significance. Moreover, the GST uses a best approximation to the seismic wavelet to transform the data into the time-scale domain, breaking through the fixed-wavelet limitation of the S transform, and so has broad adaptability. Tracing the development of ideas and theory from the wavelet transform through the S transform to the GST, we studied how the GST can improve the precision of thin-stratum detection. Noise strongly affects sequence detection in the GST, especially in low signal-to-noise data.
We studied the distribution of colored noise in the GST domain and proposed a technique for separating signal from noise there, considering two noise types, white and red, in which the noise satisfies a statistical autoregression model. For both models, the GST-domain signal-noise detection technique gives good results, demonstrating that it can be applied to real seismic data and can effectively shield seismic sequence detection from the influence of noise. On a seismic profile after GST processing, zones of intense high amplitude, together with schollen-shaped, strip-shaped, and lenticular dead zones and disordered zones, can carry specific geological meaning for a given geological background. Using the sequence-detection profile together with other seismic interpretation technologies, we can depict palaeo-geomorphology in detail, estimate sand-body extent, distinguish sedimentary facies, pick target areas, and directly guide oil and gas exploration. In lateral reservoir prediction in the XF oilfield of the Ordos Basin, the study of Triassic palaeo-geomorphology and the subdivision of the internal sequence of the stratum group played a very important role in estimating sand extent. From the high-resolution seismic profile after GST processing, we concluded that the C8 Member of the Yanchang Formation in the DZ area and the C8 Member in the BM area belong to the same deposit, providing the basis for 430 million tons of predicted reserves and a jointly built off-take potential of 3 million tons. In a key-problem study for the SLG gas field, the high-resolution sequence profile showed that the depositional direction of the H8 Member is approximately N-S or NNE-SSW. Using the sequence profile combined with the layer-level profile, we interpreted the shape of the entrenched stream.
The sunken lenticles indicate high-energy stream channels with stronger hydropower. In this way we outlined three high-energy channels and determined target areas for exploitation; finding high-energy braided rivers by high-resolution sequence processing is the key technology in the SLG area. In the ZZ area, we used GST processing to study the distribution of the main reservoir bed, S23, a thin shallow-delta sand bed. The sequence profiles show that the schollen-shaped thick sand bodies are only locally distributed, most being distributary-channel sand and distributary-bar deposits, and that the depositional direction of the S23 sand is NW-SE in the west, N-S in the centre, and NE-SW in the east. The high-resolution sequence interpretation profiles were tested against 14 wells; 2 wells mismatched, a coincidence rate of 85.7%. On the basis of the profiles we proposed 3 prediction wells: one (Yu54) has been completed, consistent with the forecast, and the other two are still drilling. The work demonstrates that the GST is an effective technology for obtaining high-resolution seismic sequence profiles, partitioning depositional microfacies, confirming sandstone strike direction, and delimiting the distribution of oil- and gas-bearing sandstone, and is the key technique for exploring lithologic oil-gas pools in complicated areas.
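The underlying Stockwell S transform can be sketched with its standard frequency-domain Gaussian-window formulation. This shows the base transform only; the generalized variant studied above replaces the fixed Gaussian window with a best approximation to the seismic wavelet.

```python
import numpy as np

def s_transform(x):
    """Discrete Stockwell (S) transform of a real signal x of length N.
    Rows are the frequency voices 0..N/2, columns are time samples."""
    N = len(x)
    X = np.fft.fft(x)
    Xc = np.concatenate([X, X])        # doubled spectrum for circular shifts
    m = np.fft.fftfreq(N) * N          # wrapped frequency offsets
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0] = np.mean(x)                  # zero-frequency voice is the mean
    for n in range(1, N // 2 + 1):
        W = np.exp(-2 * np.pi**2 * m**2 / n**2)  # Gaussian window (freq domain)
        S[n] = np.fft.ifft(Xc[n:n + N] * W)
    return S
```

A pure tone concentrates its energy in the matching frequency row, which is the localization property the high-resolution sequence profiles exploit.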

Relevance: 60.00%

Abstract:

Manfred Beckmann, David P. Enot, David P. Overy, and John Draper (2007). Representation, comparison, and interpretation of metabolome fingerprint data for total composition analysis and quality trait investigation in potato cultivars. Journal of Agricultural and Food Chemistry, 55(9), pp. 3444-3451.

Relevance: 60.00%

Abstract:

The authors explore nanoscale sensor processor (nSP) architectures. Their design includes a simple accumulator-based instruction-set architecture, sensors, limited memory, and instruction-fused sensing. Using nSP technology based on optical resonance energy transfer logic helps them decrease the design's size; their smallest design is about the size of the largest-known virus. © 2006 IEEE.
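The flavor of a simple accumulator-based ISA with fused sensing can be illustrated with a toy interpreter. The mnemonics and semantics below are hypothetical stand-ins, not the paper's actual instruction set.

```python
# A toy accumulator-machine interpreter: one accumulator, a small
# addressable memory, and a SENSE instruction that fuses sensing with
# loading, in the spirit of the nSP design sketched above.
def run(program, sensor_samples):
    acc = 0                    # the single accumulator register
    mem = {}                   # small addressable memory
    feed = iter(sensor_samples)
    pc = 0
    while pc < len(program):
        op, *arg = program[pc]
        if op == "SENSE":      # fused sensing: next sample -> accumulator
            acc = next(feed)
        elif op == "ADD":      # accumulator += memory cell
            acc += mem.get(arg[0], 0)
        elif op == "STORE":    # memory cell <- accumulator
            mem[arg[0]] = acc
        elif op == "JGZ":      # branch if accumulator > 0
            if acc > 0:
                pc = arg[0]
                continue
        pc += 1
    return acc, mem

# Sum two sensor samples: sense, save, sense again, add the saved value.
prog = [("SENSE",), ("STORE", 0), ("SENSE",), ("ADD", 0)]
```

A machine this minimal keeps the datapath tiny, which is the point when the whole processor must fit at virus scale.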

Relevance: 60.00%

Abstract:

BACKGROUND/AIMS: The obesity epidemic has spread to young adults, and obesity is a significant risk factor for cardiovascular disease. The prominence and increasing functionality of mobile phones may provide an opportunity to deliver longitudinal and scalable weight management interventions in young adults. The aim of this article is to describe the design and development of the intervention tested in the Cell Phone Intervention for You study and to highlight the importance of adaptive intervention design that made it possible. The Cell Phone Intervention for You study was a National Heart, Lung, and Blood Institute-sponsored, controlled, 24-month randomized clinical trial comparing two active interventions to a usual-care control group. Participants were 365 overweight or obese (body mass index≥25 kg/m2) young adults. METHODS: Both active interventions were designed based on social cognitive theory and incorporated techniques for behavioral self-management and motivational enhancement. Initial intervention development occurred during a 1-year formative phase utilizing focus groups and iterative, participatory design. During the intervention testing, adaptive intervention design, where an intervention is updated or extended throughout a trial while assuring the delivery of exactly the same intervention to each cohort, was employed. The adaptive intervention design strategy distributed technical work and allowed introduction of novel components in phases intended to help promote and sustain participant engagement. Adaptive intervention design was made possible by exploiting the mobile phone's remote data capabilities so that adoption of particular application components could be continuously monitored and components subsequently added or updated remotely. 
RESULTS: The cell phone intervention was delivered almost entirely via cell phone and was always present, proactive, and interactive, providing passive and active reminders, frequent opportunities for knowledge dissemination, and multiple tools for self-tracking and receiving tailored feedback. The intervention changed over 2 years to promote and sustain engagement. The personal coaching intervention, alternatively, was delivered primarily by trained coaches following a proven intervention, enhanced with a mobile application, but with all interactions with the technology being participant-initiated. CONCLUSION: The complexity and length of the technology-based randomized clinical trial created challenges in engagement and technology adaptation, which were generally discovered using novel remote monitoring technology and addressed through the adaptive intervention design. Investigators should plan to develop tools and procedures that explicitly support continuous remote monitoring of interventions, enabling adaptive intervention design in long-term, technology-based studies, as well as developing the interventions themselves.

Relevance: 60.00%

Abstract:

Fractal video compression is a relatively new video compression method whose attraction lies in its high compression ratio and simple decompression algorithm. Its computational complexity is high, however, so parallel algorithms on high-performance machines become one way out. In this study we partition the matching search, which accounts for the majority of the work in fractal video compression, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm achieves a high speedup in these distributed environments.
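The task partitioning described above can be sketched directly: each range block's matching search is independent, so the searches can be farmed out to workers. Threads on one machine stand in here for the DCOM / .NET Remoting workers on networked PCs, and blocks are flat lists with a simple squared-error cost for brevity.

```python
from concurrent.futures import ThreadPoolExecutor

def match_cost(range_block, domain_block):
    # Squared-error cost between a range block and a candidate domain block.
    return sum((a - b) ** 2 for a, b in zip(range_block, domain_block))

def best_match(range_block, domain_blocks):
    # One independent task: exhaustive search over the whole domain pool.
    costs = [match_cost(range_block, d) for d in domain_blocks]
    return min(range(len(costs)), key=costs.__getitem__)

def parallel_search(range_blocks, domain_blocks, workers=4):
    # Distribute one search task per range block across the worker pool.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda r: best_match(r, domain_blocks),
                             range_blocks))
```

Because the tasks share only the read-only domain pool, the same decomposition maps cleanly onto remote workers, which is what makes the distributed speedup possible.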

Relevance: 60.00%

Abstract:

Purpose: To develop an improved mathematical model for predicting the dose accuracy of Dosators, based upon the geometry of the machine in conjunction with measured flow properties of the powder. Methods: A mathematical model has been created, based on an analytical method of differential slices incorporating measured flow properties. The key flow properties of interest in this investigation were: flow function, effective angle of wall friction, wall adhesion, bulk density, stress ratio K, and permeability. To simulate the real process and, very importantly, validate the model, a Dosator test rig has been used to measure the forces acting on the Dosator during the filling stage, the force required to eject the dose, and the dose weight. Results: Preliminary results were obtained from the Dosator test rig. Figure 1 [Omitted] shows the dose weight for different depths to the bottom of the powder bed at the end of the stroke and different levels of pre-compaction of the powder bed. A strong influence on dose weight has been established, arising from the proximity between the Dosator and the bottom of the powder bed at the end of the stroke and from the condition of the powder bed. Conclusions: The model will provide a useful tool to predict dosing accuracy and thus optimise the future design of Dosator-based equipment, based on measured bulk properties of the powder to be handled. Another important factor with a significant influence on Dosator processes is the condition of the powder bed and the clearance between the Dosator and the bottom of the powder bed.

Relevance: 60.00%

Abstract:

The objective of the study is to determine the psychometric properties of the Epistemological Beliefs Questionnaire on Mathematics. A total of 171 secondary school mathematics teachers from the central region of Cuba participated. The results show acceptable internal consistency. The factorial structure of the scale revealed three major factors, consistent with the Model of the Three Constructs: beliefs about knowledge, about learning, and about teaching. The teachers showed irregular levels of development of their epistemological belief system about mathematics, with a tendency to fall between the poles of naivety and sophistication. In conclusion, the questionnaire is useful for evaluating teachers' beliefs about mathematics.

Relevance: 60.00%

Abstract:

Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research. Data sources: Major electronic databases were searched up to December 2005. Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored, highly sensitive electronic searches were undertaken. Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, without treatment the mean time to blindness in at least one eye was approximately 23 years, compared to 35 years with treatment. Prevalence would have to be about 3-4% in 40-year-olds with a screening interval of 10 years to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age, thus, appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would only cover 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective.
Screening using a test with initial automated classification followed by assessment by a specialised optometrist, for test positives, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed. However, a highly specific test is required to reduce large numbers of false-positive referrals. The findings that population screening is unlikely to be cost-effective are based on an economic model whose parameter estimates have considerable uncertainty, in particular, if rate of progression and/or costs of visual impairment are higher than estimated then screening could be cost-effective. Conclusions: While population screening is not cost-effective, the targeted screening of high-risk groups may be. Procedures for identifying those at risk, for quality assuring the programme, as well as adequate service provision for those screened positive would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination, and improving the performance of current testing by either refining practice or adding in a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. 
Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on costs of blindness, risk of progression and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing. © Queen's Printer and Controller of HMSO 2007. All rights reserved.
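The kind of Markov submodel used in such economic evaluations can be sketched as a small cohort simulation. The three states, transition probabilities, utilities, and discount rate below are invented for illustration and are not the report's actual parameters.

```python
import numpy as np

# Hypothetical three-state disease model: well -> OAG -> blind.
P = np.array([[0.98, 0.02, 0.00],
              [0.00, 0.95, 0.05],
              [0.00, 0.00, 1.00]])     # annual transition matrix
utility = np.array([1.0, 0.8, 0.4])    # QALY weight per state per year

def cohort_qalys(years, discount=0.035):
    """Discounted QALYs accumulated by a cohort that starts well."""
    state = np.array([1.0, 0.0, 0.0])  # state-occupancy proportions
    total = 0.0
    for t in range(years):
        total += state @ utility / (1 + discount) ** t
        state = state @ P              # advance the cohort one cycle
    return total
```

Comparing such totals (and the analogous discounted costs) between a screened and an unscreened cohort yields the cost-per-QALY figures against which a willingness-to-pay threshold like £30,000 is applied.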

Relevance: 60.00%

Abstract:

In an effort to develop a novel electronic-paper image display technology based on the electrowetting principle, a 3-D electrowetting cell was designed and fabricated. It consists of two 3-D bent electrodes, each having a horizontal surface made of gold and a vertical surface made of indium tin oxide (ITO) glass serving as a color display window, a layer of dielectric material on the 3-D electrodes, and a highly fluorinated hydrophobic layer on the surface of the dielectric. Results of this work show that electrowetting-induced motion of an aqueous droplet in immiscible oils can be achieved reversibly across the boundary between the horizontal and vertical surfaces of the 3-D electrode. It is also shown that the droplet can maintain its wetting state on a vertical sidewall electrode without a power supply after the voltage is removed. This phenomenon may form the basis for color-contrast modulation in applications that require a power-free image display, such as future electronic-paper displays. (C) 2009 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.3100201]
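The actuation described above is conventionally governed by the Young-Lippmann relation of standard electrowetting-on-dielectric theory (not stated explicitly in the abstract):

```latex
\cos\theta(V) = \cos\theta_0 + \frac{\varepsilon_0 \varepsilon_r}{2\gamma d}\,V^2
```

where \theta_0 is the contact angle at zero bias, \varepsilon_r and d are the relative permittivity and thickness of the dielectric layer, \gamma is the droplet-oil interfacial tension, and V is the applied voltage. The reported retention of the wetting state after the voltage is removed is a hysteresis effect that this equilibrium relation does not capture, and it is precisely that retention which enables the power-free display mode.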