934 results for Optimizing Compilation


Relevance: 20.00%

Abstract:

Sustainable yields from water wells in hard-rock aquifers are achieved when the well bore intersects fracture networks. Fracture networks are often not readily discernible at the surface. Lineament analysis using remotely sensed satellite imagery has been employed to identify surface expressions of fracturing, and a variety of image-analysis techniques have been successfully applied in “ideal” settings. An ideal setting for lineament detection is one where the influences of human development, vegetation, and climatic conditions are minimal and the hydrogeological conditions and geologic structure are known. There is not yet a well-accepted protocol for mapping lineaments, nor have different approaches been compared in non-ideal settings. A new image-processing/synthesis approach was developed to identify satellite imagery types suited to lineament analysis in non-ideal terrain. Four satellite sensors (ASTER, Landsat 7 ETM+, QuickBird, RADARSAT-1) and a digital elevation model were evaluated for lineament analysis in Boaco, Nicaragua, where the landscape is subject to varied vegetative cover, abundant anthropogenic features, and frequent cloud cover that limits the availability of optical satellite data. A variety of digital image-processing techniques were employed, and lineament interpretations were performed to obtain 12 complementary image products that were evaluated subjectively to identify lineaments. The 12 lineament interpretations were synthesized to create a raster image of lineament-zone coincidence that shows the level of agreement among the 12 interpretations. A composite lineament interpretation was made using the coincidence raster to restrict lineament observations to areas where multiple interpretations (at least 4 of the 12) agree. Nine of the 11 previously mapped faults were identified from the coincidence raster. An additional 26 lineaments were identified from the coincidence raster, and the locations of 10 were confirmed by field observation. Four manual pumping tests suggest that well productivity is higher for wells proximal to lineament features. Interpretations from RADARSAT-1 products were superior to interpretations from the other sensor products, suggesting that quality lineament interpretation in this region requires minimizing anthropogenic features and maximizing topographic expression. The approach developed in this study has the potential to improve the siting of wells in non-ideal regions.
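
The coincidence raster described above amounts to a per-pixel vote across the 12 interpretations. A minimal sketch of that step in Python, assuming the interpretations have already been rasterized to co-registered binary NumPy arrays (the file names and array format are hypothetical; only the count of 12 and the agreement threshold of 4 come from the abstract):

    import numpy as np

    # Hypothetical inputs: 12 co-registered binary rasters (1 = lineament
    # zone mapped, 0 = not), one per image-product interpretation.
    interpretations = [np.load(f"interpretation_{i}.npy") for i in range(12)]

    # Coincidence raster: per-pixel count of interpretations that agree.
    coincidence = np.sum(interpretations, axis=0)

    # Composite interpretation: keep only zones where at least 4 of the
    # 12 interpretations agree, as in the study.
    composite = coincidence >= 4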

Relevance: 20.00%

Abstract:

More than eighteen percent of the world’s population lives without reliable access to clean water, forced to walk long distances to get small amounts of contaminated surface water. Carrying heavy loads of water long distances and ingesting contaminated water can lead to long-term health problems and even death. These problems affect the most vulnerable populations, women, children, and the elderly, more than anyone else. Water access is one of the most pressing issues in development today. Boajibu, a small village in Sierra Leone where the author served in the Peace Corps for two years, lacks access to clean water. Construction of a water distribution system was halted when a civil war broke out in 1992 and has not been resumed since. The community currently relies on hand-dug and borehole wells that can become dirty during the dry season, which forces people to drink contaminated water or to travel a long distance to collect clean water. This report is intended to provide a design of the system as it was meant to be built. The water system design was completed based on the taps present, interviews with local community leaders, local surveying, and points taken with a GPS. The design is a gravity-fed branched water system, supplied by a natural spring on a hill adjacent to Boajibu; however, the flow rate of the spring is unknown. There has to be enough flow from the spring over a 24-hour period to meet the demands of the users on a daily basis, which is called providing continuous flow. If the spring has less than this amount of flow, the system must provide intermittent flow, that is, flow restricted to a few hours a day. A minimum flow rate of 2.1 liters per second was found to be necessary to provide continuous flow to the users of Boajibu; if this flow is not met, intermittent flow can be provided instead. In order to aid the construction of a distribution system in the absence of someone with formal engineering training, a table was created detailing water storage tank sizing based on possible source flow rates. A builder can interpolate on the measured source flow rate to get the tank size from the table (this lookup is illustrated in the sketch following this abstract). However, any flow rate below 2.1 liters per second cannot be used with the table. In this case, the builder should size the tank so that it can take in the water that will be supplied overnight, since all of the water will be drained during the day because the users will demand more than the spring can supply. In the developing world, there is often a problem collecting enough money to fund large infrastructure projects, such as a water distribution system; often there is only enough money to add one or two loops to the system, so it is helpful to know where those loops can be most effectively placed. Various possible loops were designated for the Boajibu water distribution system, and the Adaptive Greedy Heuristic Loop Addition Selection Algorithm (AGHLASA) was used to rank the effectiveness of the candidate loops. Loop 1, which was furthest upstream, was selected because it benefited the most people for the least cost, while loops further downstream were found to be less effective because they would benefit fewer people. Further studies should be conducted on the water use habits of the people of Boajibu to more accurately predict the demands that will be placed on the system.
Further population surveying should also be conducted to predict population change over time, so that appropriate capacity can be built into the system to accommodate future growth. The flow at the spring should be measured using a V-notch weir and the system adjusted accordingly. Future studies could adjust the loop-ranking method so that users who use the water system for different lengths of time are not counted identically, and so that vulnerable users are weighted more heavily than more robust users.
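
The tank-sizing lookup referenced above can be written as a short interpolation routine. The sketch below is illustrative only: the flow/volume pairs are placeholder values, not the report’s table, and the 10-hour overnight window is an assumption; the 2.1 L/s continuous-flow threshold and the overnight-storage rule for lower flows come from the abstract:

    import numpy as np

    MIN_CONTINUOUS_FLOW = 2.1      # L/s, threshold for continuous service
    NIGHT_SECONDS = 10 * 3600      # assumed overnight inflow window (10 h)

    # Placeholder table: source flow (L/s) -> storage tank volume (L).
    flows        = np.array([2.1, 2.5, 3.0, 3.5, 4.0])
    tank_volumes = np.array([30000.0, 26000.0, 22000.0, 19000.0, 17000.0])

    def tank_size(source_flow_lps: float) -> float:
        """Return a storage tank volume (L) for a measured spring flow."""
        if source_flow_lps < MIN_CONTINUOUS_FLOW:
            # Intermittent flow: the tank drains fully each day, so size
            # it to capture everything the spring supplies overnight.
            return source_flow_lps * NIGHT_SECONDS
        # Continuous flow: interpolate between tabulated sizes.
        return float(np.interp(source_flow_lps, flows, tank_volumes))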

Relevance: 20.00%

Abstract:

The distance learning programme has made its entrance into nursing education, and many see it as a break with the education’s traditions of teaching in the classroom, in practice rooms and at the patient’s bedside (Chaffin & Maddux 2004). Traditionally, many of the technical skills and personal qualities that nurses must acquire are learned through interaction with others. The distance learning programme has therefore given rise to some new problems and challenges, and this article discusses some of these. Empirically, the article builds on a comparative study of three student nurse classes from two Danish nursing schools, including one based on the distance learning programme. By following both distance learning and traditional nursing students in their clinical training, light is cast upon the differences and similarities that may exist in the clinical skills and competences that the students gain under the two programmes. Theoretically, the article builds on Etienne Wenger’s theory of learning in communities of practice, focusing on the relationship between experience and competence in learning-related communities of practice (Wenger 1998; Wenger 2004). The article contributes findings related to the differences between the programmes and the different types of students that each programme attracts. It argues that an increased didactic and pedagogical focus upon the field of tension between experience and competence will enable an optimisation of the learning conditions of distance learning students in their clinical teaching. In conclusion, the article thus places focus on the questions surrounding teaching design in relation to the distance learning programme.

Relevance: 20.00%

Abstract:

The present study reports for the first time the optimization of the infrared (1523 nm) to near-infrared (980 nm) upconversion quantum yield (UC-QY) of hexagonal trivalent-erbium-doped sodium yttrium fluoride (β-NaYF4:Er3+) in a perfluorocyclobutane (PFCB) host matrix under monochromatic excitation. Maximum internal and external UC-QYs of 8.4% ± 0.8% and 6.5% ± 0.7%, respectively, have been achieved under 1523 nm excitation at 970 ± 43 W m−2 for an optimum Er3+ concentration of 25 mol% and a phosphor concentration of 84.9% w/w in the matrix. These results correspond to normalized internal and external efficiencies of 0.86 ± 0.12 cm2 W−1 and 0.67 ± 0.10 cm2 W−1, respectively. These are the highest values ever reported for β-NaYF4:Er3+ under monochromatic excitation. The special characteristics of both the UC phosphor β-NaYF4:Er3+ and the PFCB matrix give rise to this outstanding property. Detailed power- and time-dependent luminescence measurements reveal energy transfer upconversion as the dominant UC mechanism.
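
As a consistency check on the figures above, the normalized efficiency is the quantum yield divided by the excitation irradiance; in LaTeX notation, for the internal value:

    \eta_{\mathrm{int}} = \frac{\Phi_{\mathrm{int}}}{E_e}
                        = \frac{0.084}{970\ \mathrm{W\,m^{-2}}}
                        \approx 8.7\times 10^{-5}\ \mathrm{m^2\,W^{-1}}
                        = 0.87\ \mathrm{cm^2\,W^{-1}}

This agrees with the reported 0.86 ± 0.12 cm2 W−1 within rounding, and the external value follows identically: 0.065 / 970 ≈ 0.67 cm2 W−1.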

Relevance: 20.00%

Abstract:

Performance and carcass data from 624 steers in three experiments were used to evaluate potential strategies that might be used with incoming feeder cattle to remove animals that produce low-value carcasses when cattle are sold on a value-based grid. Removing the 10% of carcasses with the lowest net value from each group increased the overall average net value of the remaining carcasses by $17.50 to $21.09. Carcass weight was found to be the most significant factor determining the net value of a carcass. Gain of the steers during the first 3 to 5 weeks of the feeding period was significantly related to average final gain and carcass value, but accounted for a small portion of the overall variation in gain or carcass value. Use of initial gain was successful in identifying ten of the sixty-four carcasses with the least net value on a value-based grid. Adding frame score and measurement of initial backfat thickness along with initial gain did not significantly improve identification of the low-value carcasses. Sorting the steers as feeders based on frame score and initial backfat thickness resulted in differences in performance and carcass measurements. The low-value carcasses tended to be concentrated in the smaller-framed steers.
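
The 10% removal strategy evaluated above is a simple bottom-decile trim. A minimal sketch (the input would be per-carcass net grid values; nothing here reproduces the experiments’ data):

    import numpy as np

    def trim_bottom_decile(net_values: np.ndarray):
        """Drop the 10% of carcasses with the lowest net value and
        report the lift in average net value ($/carcass)."""
        n_drop = int(len(net_values) * 0.10)
        kept = np.sort(net_values)[n_drop:]
        return kept, kept.mean() - net_values.mean()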

Relevance: 20.00%

Abstract:

We studied the influence of surveyed area size on density estimates obtained by camera-trapping in a low-density felid population (1-2 individuals/100 km²). We applied non-spatial capture-recapture (CR) and spatial CR (SCR) models to Eurasian lynx during winter 2005/2006 in the northwestern Swiss Alps by sampling an area divided into 5 nested plots ranging from 65 to 760 km². CR model density estimates (95% CI) for models M0 and Mh decreased from 2.61 (1.55-3.68) and 3.6 (1.62-5.57) independent lynx/100 km², respectively, in the smallest area surveyed to 1.20 (1.04-1.35) and 1.26 (0.89-1.63) independent lynx/100 km², respectively, in the largest. SCR model density estimates also decreased with increasing sampling area, but not significantly. High individual range overlap in relatively small areas (the edge effect) is the most plausible reason for this positive bias in the CR models. Our results confirm that SCR models are much more robust to changes in trap-array size than CR models, thus avoiding overestimation of density in smaller areas. However, when a study is concerned with monitoring population changes, large spatial efforts (area surveyed ≥760 km²) are required to obtain reliable and precise density estimates at these population densities and recapture rates.
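
The edge effect arises because lynx whose ranges only partly overlap a small plot are still photographed, so the abundance estimate stays high while the reference area shrinks. A common non-spatial correction (not necessarily the one applied here) buffers the trap polygon by half the mean maximum distance moved (1/2 MMDM) before dividing; a minimal sketch, assuming a convex trap polygon:

    import numpy as np

    def cr_density(n_hat: float, trap_area_km2: float,
                   perimeter_km: float, mmdm_km: float) -> float:
        """Non-spatial CR density with a 1/2-MMDM boundary strip.

        For a convex polygon, buffering by w grows the area by
        perimeter*w plus the corner arcs (pi*w**2)."""
        w = 0.5 * mmdm_km
        effective_area = trap_area_km2 + perimeter_km * w + np.pi * w**2
        return 100.0 * n_hat / effective_area   # lynx per 100 km^2

The smaller the plot, the larger the share of the effective area contributed by the buffer, which is why estimates from the 65 km² plot are the most sensitive to this correction.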

Relevance: 20.00%

Abstract:

A patient classification system was developed integrating a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources.

The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of preshift and postshift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic.

The nursing distribution system was a linear programming model using a branch and bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for allocation of staff.

The demand constraints were requirements to meet the total acuity points needed for each unit and to have a minimum number of RNs on each unit. The supply constraints were: (1) the total availability of each type of staff and the value of that staff member, where value was determined relative to that staff type’s ability to perform the job function of an RN (e.g., eight hours of an RN = 8 points, of an LVN = 6 points); and (2) the number of personnel available for floating between units.

The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison. Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints.

Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization by the addition of a dollar coefficient to the objective function.
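
A toy version of the allocation model can be stated with SciPy’s mixed-integer solver. Everything numeric below (staff counts, penalties, acuity totals) is hypothetical except the staff valuation of 8 points per RN shift and 6 per LVN shift quoted above; the float-pool constraint is omitted for brevity:

    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    # 2 staff types (RN, LVN) x 3 units; x is flattened as x[type, unit].
    staff_value = np.array([8, 6])        # acuity points per 8-h shift
    penalty     = np.array([2.0, 1.0])    # objective weights per shift
    available   = np.array([10, 8])       # staff on hand per type
    acuity      = np.array([40, 30, 25])  # demand in points per unit
    min_rn      = np.array([2, 2, 1])     # minimum RNs per unit
    n_types, n_units = 2, 3

    c = np.repeat(penalty, n_units)       # minimize penalty-weighted staff
    rows, lbs, ubs = [], [], []
    for u in range(n_units):
        # Demand: assigned points on unit u must cover its acuity total.
        row = np.zeros(n_types * n_units)
        for s in range(n_types):
            row[s * n_units + u] = staff_value[s]
        rows.append(row); lbs.append(acuity[u]); ubs.append(np.inf)
        # Minimum RN coverage on unit u (RN block is the first n_units).
        row = np.zeros(n_types * n_units)
        row[u] = 1
        rows.append(row); lbs.append(min_rn[u]); ubs.append(np.inf)
    for s in range(n_types):
        # Supply: cannot assign more staff of a type than are available.
        row = np.zeros(n_types * n_units)
        row[s * n_units:(s + 1) * n_units] = 1
        rows.append(row); lbs.append(0); ubs.append(available[s])

    res = milp(c, constraints=LinearConstraint(np.array(rows), lbs, ubs),
               integrality=np.ones(n_types * n_units),
               bounds=Bounds(0, np.inf))
    print(res.x.reshape(n_types, n_units))  # staff of each type per unit

SciPy’s HiGHS backend plays the role of the branch and bound technique described in the abstract.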

Relevance: 20.00%

Abstract:

The North Atlantic spring bloom is one of the main events that lead to carbon export to the deep ocean and drive oceanic uptake of CO₂ from the atmosphere. Here we use a suite of physical, bio-optical and chemical measurements made during the 2008 spring bloom to optimize and compare three different models of biological carbon export. The observations are from a Lagrangian float that operated south of Iceland from early April to late June, and were calibrated with ship-based measurements. The simplest model is representative of typical NPZD models used for the North Atlantic, while the most complex model explicitly includes diatoms and the formation of fast-sinking diatom aggregates and cysts under silicate limitation. We carried out a variational optimization and error analysis for the biological parameters of all three models, and compared their ability to replicate the observations. The observations were sufficient to constrain most phytoplankton-related model parameters to accuracies of better than 15%. However, the lack of zooplankton observations leads to large uncertainties in the model parameters for grazing. The simulated vertical carbon flux at 100 m depth is similar between models and agrees well with the available observations, but at 600 m the simulated flux is larger by a factor of 2.5 to 4.5 for the model with diatom aggregation. While none of the models can be formally rejected based on their misfit with the available observations, the model that includes export by diatom aggregation has a statistically significantly better fit to the observations and more accurately represents the mechanisms and timing of carbon export based on observations not included in the optimization. Thus, models that accurately simulate the upper 100 m do not necessarily accurately simulate export to greater depths.
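
In spirit, the variational optimization minimizes a weighted model-observation misfit over the biological parameters. The sketch below is a generic stand-in, fitting a three-parameter logistic bloom curve by least squares against synthetic data; the actual study optimized full NPZD-class models against the float observations:

    import numpy as np
    from scipy.optimize import least_squares

    def bloom_model(params, t):
        """Toy stand-in for a biogeochemical model: logistic growth."""
        mu, p0, k = params         # growth rate, initial biomass, capacity
        return k / (1.0 + (k / p0 - 1.0) * np.exp(-mu * t))

    def residuals(params, t, obs, sigma):
        # Weighted misfit, the summand of a variational cost function.
        return (bloom_model(params, t) - obs) / sigma

    t = np.linspace(0.0, 60.0, 30)                # days into the bloom
    truth = bloom_model([0.2, 0.1, 2.0], t)       # synthetic "observations"
    obs = truth + np.random.default_rng(0).normal(0.0, 0.05, t.size)

    fit = least_squares(residuals, x0=[0.1, 0.05, 1.0],
                        args=(t, obs, 0.05), bounds=(1e-6, np.inf))
    # Linearized parameter uncertainties from the Jacobian at the optimum,
    # the same idea as the study's error analysis.
    cov = np.linalg.inv(fit.jac.T @ fit.jac)
    print(fit.x, np.sqrt(np.diag(cov)))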

Relevance: 20.00%

Abstract:

Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment, and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs, to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester triage system (MTS). Yet these scores focus only on treatment priority, have suboptimal performance, and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority; (b) medical risk and thus need for in-hospital treatment; and (c) post-acute care needs of patients at the most proximal time point of ED admission.

Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded, and leftover blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge from the nurses’ and, if involved, the social workers’ perspective. To assess outcomes, data from electronic medical records will be used, and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care, and quality-of-life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints are defined as (a) initial triage priority (high vs. low priority) adjudicated by the attending ED physician at ED discharge; (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission, to assess patients’ risk and thus need for in-hospital treatment; and (c) post-acute care needs after hospital discharge, defined as transfer of patients to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients’ satisfaction with care, overall hospital costs, and patients’ care needs after returning home.

Discussion: Using a reliable initial triage system to estimate initial treatment priority, need for in-hospital treatment, and post-acute care needs is an innovative and persuasive approach for a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and to optimize the allocation of resources to the sickest patients from admission to discharge.
The algorithms derived in this study will be compared in a later randomized controlled trial against a usual-care control group in terms of resource use, length of hospital stay, overall costs, and patient outcomes in terms of mortality, re-hospitalization, quality of life, and satisfaction with care.
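
The abstract does not specify the modelling technique behind the derived algorithm; purely as a generic illustration of endpoint (b), a binary risk model over admission-time predictors might look as follows (the features, data, and event rate are entirely synthetic):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical admission-time features per patient:
    # [heart rate, systolic BP, respiratory rate, PACD score].
    rng = np.random.default_rng(1)
    X = rng.normal([90, 120, 18, 8], [15, 20, 4, 3], size=(500, 4))
    # Label: adverse 30-day outcome (death or ICU admission); random here.
    y = (rng.random(500) < 0.1).astype(int)

    model = LogisticRegression().fit(X, y)
    risk = model.predict_proba(X[:5])[:, 1]   # predicted 30-day risk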

Relevance: 20.00%

Abstract:

Terrestrial records of past climatic conditions, such as lake sediments and speleothems, provide data of great importance for understanding environmental changes. However, unlike marine and ice core records, terrestrial palaeodata are often not available in databases or in a format that is easily accessible to the non-specialist. As a consequence, many excellent terrestrial records are unknown to the broader palaeoclimate community and are not included in compilations, comparisons, or modelling exercises. Here we present a compilation of Western European terrestrial palaeo-records covering, entirely or partially, the 60–8 ka INTIMATE time period. The compilation contains 56 natural archives, including lake records, speleothems, ice cores, and terrestrial proxies in marine records. The compilation is limited to records of high temporal resolution and/or records that provide climate proxies or quantitative reconstructions of environmental parameters, such as temperature or precipitation, and that are of relevance and interest to a broader community. We briefly review the different types of terrestrial archives, their respective proxies, their interpretation, and their application to palaeoclimatic reconstructions. We also discuss the importance of independent chronologies and the issue of record synchronization. The aim of this exercise is to provide the wider palaeo-community with a consistent compilation of high-quality terrestrial records, to facilitate model-data comparisons, and to identify key areas of interest for future investigations. We use the compilation to investigate Western European latitudinal climate gradients during the deglacial period and, despite the poorly constrained chronologies of the older records, we summarize the main results obtained from NW and SW European terrestrial records before the LGM.