Jacobian-free Newton-Krylov methods with GPU acceleration for computing nonlinear ship wave patterns
Abstract:
The nonlinear problem of steady free-surface flow past a submerged source is considered as a case study for three-dimensional ship wave problems. Of particular interest is the distinctive wedge-shaped wave pattern that forms on the surface of the fluid. By reformulating the governing equations with a standard boundary-integral method, we derive a system of nonlinear algebraic equations that enforce a singular integro-differential equation at each midpoint on a two-dimensional mesh. Our contribution is to solve the system of equations with a Jacobian-free Newton-Krylov method together with a banded preconditioner that is carefully constructed with entries taken from the Jacobian of the linearised problem. Further, we are able to utilise graphics processing unit acceleration to significantly increase the grid refinement and decrease the run-time of our solutions in comparison to schemes that are presently employed in the literature. Our approach provides opportunities to explore the nonlinear features of three-dimensional ship wave patterns, such as the shape of steep waves close to their limiting configuration, in a manner that has been possible in the two-dimensional analogue for some time.
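The Jacobian-free Newton-Krylov approach described above can be sketched with standard scientific-Python tools. The sketch below is not the authors' ship-wave code: it applies the same ingredients (a matrix-free Newton-Krylov solve with a banded preconditioner built from a linearised operator) to an assumed 1D nonlinear test problem, u'' = exp(u) with zero boundary conditions.

```python
# Minimal sketch of Jacobian-free Newton-Krylov with a banded preconditioner.
# Test problem (an assumption, not the ship-wave equations): u'' - exp(u) = 0.
import numpy as np
from scipy.optimize import newton_krylov
from scipy.linalg import solve_banded
from scipy.sparse.linalg import LinearOperator

n = 100
h = 1.0 / (n + 1)

def residual(u):
    # Discrete residual of u'' - exp(u) = 0 with u(0) = u(1) = 0
    r = np.empty(n)
    r[0] = (-2 * u[0] + u[1]) / h**2 - np.exp(u[0])
    r[-1] = (u[-2] - 2 * u[-1]) / h**2 - np.exp(u[-1])
    r[1:-1] = (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2 - np.exp(u[1:-1])
    return r

# Banded preconditioner assembled from the Jacobian of the linearised
# (constant-coefficient diffusion) problem, in LAPACK banded storage.
ab = np.zeros((3, n))
ab[0, 1:] = 1.0 / h**2    # superdiagonal
ab[1, :] = -2.0 / h**2    # main diagonal
ab[2, :-1] = 1.0 / h**2   # subdiagonal
M = LinearOperator((n, n), matvec=lambda v: solve_banded((1, 1), ab, v))

# Jacobian-free: newton_krylov only ever calls residual(), approximating
# Jacobian-vector products by finite differences; M preconditions the
# inner Krylov (LGMRES) iterations.
u = newton_krylov(residual, np.zeros(n), method='lgmres', inner_M=M, f_tol=1e-8)
print(np.max(np.abs(residual(u))))
```

The preconditioner captures only the dominant linear part of the operator, mirroring the paper's strategy of taking banded entries from the Jacobian of the linearised problem; on a GPU, the residual evaluation is the natural kernel to accelerate.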
A finite volume method for solving the two-sided time-space fractional advection-dispersion equation
Abstract:
We present a finite volume method to solve the time-space two-sided fractional advection-dispersion equation on a one-dimensional domain. The spatial discretisation employs fractionally-shifted Grünwald formulas to discretise the Riemann-Liouville fractional derivatives at control volume faces in terms of function values at the nodes. We demonstrate how the finite volume formulation provides a natural, convenient and accurate means of discretising this equation in conservative form, compared to using a conventional finite difference approach. Results of numerical experiments are presented to demonstrate the effectiveness of the approach.
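The shifted Grünwald discretisation at the heart of the scheme can be illustrated directly. The sketch below (an assumption for illustration, not the paper's finite volume code) builds the standard shifted Grünwald weights and approximates the left Riemann-Liouville derivative of f(x) = x², whose exact fractional derivative is known in closed form.

```python
# Sketch: shifted Grünwald approximation of the left Riemann-Liouville
# fractional derivative, checked against the exact result for f(x) = x^2.
import numpy as np
from math import gamma

def grunwald_weights(alpha, N):
    # g_k = (-1)^k * binom(alpha, k), via the stable recurrence
    # g_0 = 1, g_k = g_{k-1} * (k - 1 - alpha) / k
    g = np.empty(N + 1)
    g[0] = 1.0
    for k in range(1, N + 1):
        g[k] = g[k - 1] * (k - 1 - alpha) / k
    return g

def shifted_grunwald(f, x, alpha, N):
    # Shift p = 1: D^alpha f(x) ~ h^{-alpha} * sum_k g_k f(x - (k-1)h)
    h = x / N
    g = grunwald_weights(alpha, N + 1)
    s = x - (np.arange(N + 2) - 1) * h   # sample points from x+h down to 0
    return h**(-alpha) * np.dot(g, f(s))

alpha = 1.5                  # fractional order in (1, 2)
f = lambda s: s**2
approx = shifted_grunwald(f, 1.0, alpha, 1000)
exact = gamma(3) / gamma(3 - alpha)   # D^alpha x^2 at x = 1
print(approx, exact)
```

The unshifted formula (p = 0) is unstable for 1 < α < 2 in implicit schemes, which is why the shifted variant is used at control volume faces; the approximation above is first-order accurate in h.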
Abstract:
For clinical use of electrocardiogram (ECG) signal analysis, it is important to detect not only the centres of the P wave, the QRS complex and the T wave, but also the time intervals, such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, or the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms that detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and from beat to beat. To address this variability we propose the use of Markov-chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study investigating the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal, in which details such as wave morphology and noise levels are variable.
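A toy version of the proposed MCMC feature extraction can be sketched as follows. The wave model (a single Gaussian bump standing in for, say, a P wave), the flat priors and the proposal scales are all assumptions for illustration; the abstract does not specify the authors' model.

```python
# Sketch: Metropolis-Hastings sampling of the centre and width of a single
# Gaussian wave in a noisy synthetic signal (toy stand-in for ECG features).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
true_centre, true_width, amp, noise = 0.4, 0.05, 1.0, 0.1
y = amp * np.exp(-0.5 * ((t - true_centre) / true_width)**2) \
    + noise * rng.normal(size=t.size)

def log_post(c, w):
    # Gaussian likelihood with flat, bounded priors on centre and width
    if not (0.0 < c < 1.0 and 0.005 < w < 0.5):
        return -np.inf
    model = amp * np.exp(-0.5 * ((t - c) / w)**2)
    return -0.5 * np.sum((y - model)**2) / noise**2

c, w = 0.5, 0.1                       # deliberately wrong starting point
lp = log_post(c, w)
samples = []
for i in range(5000):
    c_new, w_new = c + 0.01 * rng.normal(), w + 0.005 * rng.normal()
    lp_new = log_post(c_new, w_new)
    if np.log(rng.random()) < lp_new - lp:   # Metropolis accept/reject
        c, w, lp = c_new, w_new, lp_new
    if i >= 1000:                            # discard burn-in
        samples.append((c, w))

centres = np.array([s[0] for s in samples])
print(centres.mean(), centres.std())
```

The posterior spread in the samples is what makes this approach attractive for variable signals: instead of a single detected fiducial point, each feature comes with an uncertainty estimate.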
Abstract:
A Remote Sensing Core Curriculum (RSCC) development project is currently underway. This project is being conducted under the auspices of the National Center for Geographic Information and Analysis (NCGIA). RSCC is an outgrowth of the NCGIA GIS Core Curriculum project. It grew out of discussions begun at NCGIA Initiative 12 (I-12): 'Integration of Remote Sensing and Geographic Information Systems'. This curriculum development project focuses on providing professors, teachers and instructors at undergraduate and graduate institutions with course materials, prepared by experts in specific subject areas, for use in the classroom.
Abstract:
The vision of a digital earth (DE) is continuously evolving, and the next-generation infrastructures, platforms and applications are being implemented. In this article, we attempt to initiate a debate within the DE and affine communities about 'why' a digital earth curriculum (DEC) is needed, 'how' it should be developed, and 'what' it could look like. It is impossible to do justice to the Herculean effort of DEC development without extensive consultations with the broader community. We propose a frame for the debate (the what, why, and how of a DEC) and a rationale for and elements of a curriculum for educating the coming generations of digital natives, and we indicate possible realizations. By emphasizing its unique characteristics, we argue in particular that a DEC is not a déjà vu of the classical research and training agendas of geographic information science, remote sensing, and similar fields.
Abstract:
A modular, graphics-oriented Internet browser has been developed to enable non-technical client access to a literal spinning world of information and remotely sensed imagery. The Earth Portal (www.earthportal.net) uses the ManyOne browser (www.manyone.net) to provide engaging point-and-click views of the Earth fully tessellated with remotely sensed imagery and geospatial data. The ManyOne browser technology uses Mozilla with embedded plugins to apply multiple 3-D graphics engines, e.g. ArcGlobe or GeoFusion, that directly link with the open-systems architecture of the geospatial infrastructure. This innovation allows for rendering of satellite imagery directly over the Earth's surface and requires no technical training by the web user. Effective use of this global distribution system for the remote sensing community requires a minimal compliance with protocols and standards that have been promoted by NSDI and other open-systems standards organizations.
Abstract:
A variety of sustainable development research efforts and related activities are attempting to reconcile the issues of conserving our natural resources without limiting economic motivation, while also improving our social equity and quality of life. Land use/land cover change, occurring on a global scale, is an aggregate of local land use decisions and profoundly impacts our environment. It is therefore the local decision-making process that should be the eventual target of many of the ongoing data collection and research efforts which strive toward supporting a sustainable future. Satellite imagery is a primary source upon which to build a core data set for use by researchers in analyzing this global change. A process is necessary to link global change research, utilizing satellite imagery, to the local land use decision-making process. One example of this is the NASA-sponsored Regional Data Center (RDC) prototype. The RDC approach is an attempt to integrate science and technology at the community level. The anticipated result of this complex interaction between the research and decision-making communities will be realized in the form of long-term benefits to the public.
Abstract:
Beginning in 1974, the State of Maryland created spatial databases under the MAGI (Maryland's Automated Geographic Information) system. Since that early GIS, other state and local agencies have begun GISs covering a range of applications, from critical lands inventories to cadastral mapping. In 1992, state agencies, local agencies, universities, and businesses began a series of GIS coordination activities, resulting in the formation of the Maryland Local Geographic Information Committee and the Maryland State Government Geographic Information Coordinating Committee. GIS activities and system installations can be found in 22 counties plus Baltimore City, and in most state agencies. Maryland's decision makers rely on a variety of GIS reports and products to conduct business and to communicate complex issues more effectively. This paper presents the status of Maryland's GIS applications for local and state decision making.
Abstract:
Historically, it appears that some of the WRCF have survived because i) they lack sufficient quantities of commercially valuable species; ii) they are located in remote or inaccessible areas; or iii) they have been protected as national parks and sanctuaries. Forests will be protected when the people who are deciding the fate of forests conclude that the conservation of forests is more beneficial, e.g. generates higher incomes or has cultural or social values, than their clearance. If this is not the case, forests will continue to be cleared and converted. In the future, the WRCF may be protected only by focused attention. Future policy options may include strategies for strong protection measures, the raising of public awareness about the value of forests, and concerted actions for reducing pressure on forest lands by providing alternatives to forest exploitation to meet the growing demand for forest products. Many areas with low population densities offer an opportunity for conservation if appropriate steps are taken now by national governments and the international community. This opportunity must be founded upon increased public and government awareness that forests are of vast importance to human welfare and to ecosystem services such as biodiversity, watershed protection, and carbon balance. Also paramount to this opportunity is the increased scientific understanding of forest dynamics and the technical capability to install global observation and assessment systems. High-resolution satellite data such as Landsat 7 and other technologically advanced satellite programs will provide unprecedented monitoring options for governing authorities. Technological innovation can contribute to the way forests are protected. The use of satellite imagery for regular monitoring and the Internet for information dissemination provides effective tools for raising worldwide awareness of the significance of forests and the intrinsic value of nature.
Abstract:
Experience gained from numerous projects conducted by the U.S. Environmental Protection Agency's (EPA) Environmental Monitoring Systems Laboratory in Las Vegas, Nevada has provided insight into functional issues of mapping, monitoring, and modeling of wetland habitats. Three case studies in poster form describe these issues pertinent to managing wetland resources as mandated under Federal laws. A multiphase project was initiated by the EPA Alaska operations office to provide detailed wetland mapping of arctic plant communities in an area under petroleum development pressure. Existing classification systems did not meet EPA needs. Therefore a Habitat Classification System (HCS) derived from aerial photography was compiled. In conjunction with this, photointerpretive keys were developed. These products enable EPA personnel to map large inaccessible areas of the arctic coastal plain and evaluate the sensitivity of various wetland habitats relative to petroleum development needs.
Abstract:
Next Generation Sequencing (NGS) has revolutionised molecular biology, resulting in an explosion of data sets and an increasing role in clinical practice. Such applications necessarily require rapid identification of the organism as a prelude to annotation and further analysis. NGS data consist of a substantial number of short sequence reads, given context through downstream assembly and annotation, a process requiring reads consistent with the assumed species or species group. Highly accurate results have been obtained for restricted sets using SVM classifiers, but such methods are difficult to parallelise and success depends on careful attention to feature selection. This work examines the problem at very large scale, using a mix of synthetic and real data with a view to determining the overall structure of the problem and the effectiveness of parallel ensembles of simpler classifiers (principally random forests) in addressing the challenges of large scale genomics.
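The ensemble-of-simple-classifiers idea can be sketched on toy data. The example below is an assumption for illustration, not the study's pipeline: it simulates reads from two "species" that differ only in GC content, featurises each read by its 3-mer frequencies, and trains a random forest to recover the source.

```python
# Sketch: classifying short sequence reads by source organism using
# k-mer frequency features and a random forest (synthetic two-species data).
import numpy as np
from itertools import product
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
BASES = "ACGT"
KMERS = ["".join(p) for p in product(BASES, repeat=3)]   # 64 features
IDX = {m: i for i, m in enumerate(KMERS)}

def simulate_read(gc, length=100):
    # i.i.d. bases with the given GC content (a crude stand-in for a genome)
    p = [(1 - gc) / 2, gc / 2, gc / 2, (1 - gc) / 2]     # A, C, G, T
    return "".join(rng.choice(list(BASES), size=length, p=p))

def kmer_freqs(read, k=3):
    v = np.zeros(len(KMERS))
    for i in range(len(read) - k + 1):
        v[IDX[read[i:i + k]]] += 1
    return v / v.sum()

# Two synthetic "species" distinguished by base composition
reads = [simulate_read(0.35) for _ in range(300)] \
      + [simulate_read(0.65) for _ in range(300)]
X = np.array([kmer_freqs(r) for r in reads])
y = np.array([0] * 300 + [1] * 300)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print(clf.score(Xte, yte))
```

Unlike an SVM, the forest needs no explicit feature selection and its trees train independently, which is what makes this family of classifiers straightforward to parallelise at the scales the abstract describes.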
Abstract:
Current military conflicts are characterized by the use of the improvised explosive device. Improvements in personal protection, medical care, and evacuation logistics have resulted in increasing numbers of casualties surviving with complex musculoskeletal injuries, often leading to lifelong disability. Thus, there exists an urgent requirement to investigate the mechanism of extremity injury caused by these devices in order to develop mitigation strategies. In addition, the wounds of war are no longer restricted to the battlefield; similar injuries can be witnessed in civilian centers following a terrorist attack. Key to understanding such mechanisms of injury is the ability to deconstruct the complexities of an explosive event into a controlled, laboratory-based environment. In this article, a traumatic injury simulator, designed to recreate in the laboratory the impulse that is transferred to the lower extremity from an anti-vehicle explosion, is presented and characterized experimentally and numerically. Tests with instrumented cadaveric limbs were then conducted to assess the simulator’s ability to interact with the human in two mounting conditions, simulating typical seated and standing vehicle passengers. This experimental device will now allow us to (a) gain comprehensive understanding of the load-transfer mechanisms through the lower limb, (b) characterize the dissipating capacity of mitigation technologies, and (c) assess the bio-fidelity of surrogates.
Abstract:
The lower limb of military vehicle occupants has been the most injured body part due to undervehicle explosions in recent conflicts. Understanding the injury mechanism and causality of injury severity could aid in developing better protection. Therefore, we tested 4 different occupant postures (seated, brace, standing, standing with knee locked in hyper‐extension) in a simulated under‐vehicle explosion (solid blast) using our traumatic injury simulator in the laboratory; we hypothesised that occupant posture would affect injury severity. No skeletal injury was observed in the specimens in seated and braced postures. Severe, impairing injuries were observed in the foot of standing and hyper‐extended specimens. These results demonstrate that a vehicle occupant whose posture at the time of the attack incorporates knee flexion is more likely to be protected against severe skeletal injury to the lower leg.
Abstract:
Lower extremities are particularly susceptible to injury in an under‐vehicle explosion. Operational fitness of military vehicles is assessed through anthropometric test devices (ATDs) in full‐scale blast tests. The aim of this study was to compare the response between the Hybrid‐III ATD, the MiL‐Lx ATD and cadavers in our traumatic injury simulator, which is able to replicate the response of the vehicle floor in an under‐vehicle explosion. All specimens were fitted with a combat boot and tested on our traumatic injury simulator in a seated position. The load recorded in the ATDs was above the tolerance levels recommended by NATO in all tests; no injuries were observed in any of the 3 cadaveric specimens. The Hybrid‐III produced higher peak forces than the MiL‐Lx. The time to peak strain in the calcaneus of the cadavers was similar to the time to peak force in the ATDs. Maximum compression of the sole of the combat boot was similar for cadavers and MiL‐Lx, but significantly greater for the Hybrid‐III. These results suggest that the MiL‐Lx has a more biofidelic response to under‐vehicle explosive events compared to the Hybrid‐III. Therefore, it is recommended that mitigation strategies are assessed using the MiL‐Lx surrogate and not the Hybrid‐III.