988 results for database design
Abstract:
Magnetic resonance imaging (MRI) magnets have very stringent constraints on the homogeneity of the static magnetic field that they generate over desired imaging regions. The magnet system also preferably generates very little stray field external to its structure, so that ease of siting and safety are assured. This work concentrates on deriving means of rapidly computing the effect of 'cold' and 'warm' ferromagnetic material in or around the superconducting magnet system, so as to facilitate the automated design of hybrid-material MR magnets. A complete scheme for the direct calculation of the spherical harmonics of the magnetic field generated by a circular ring of ferromagnetic material is derived under the conditions of arbitrary external magnetizing fields. The magnetic field produced by the superconducting coils in the system is computed using previously developed methods. The final hybrid algorithm is fast enough for use in large-scale optimization methods. The resultant fields from a practical example of a 4 T clinical MRI magnet containing both superconducting coils and magnetic material are presented.
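The spherical-harmonic scheme and the coil-field methods the abstract refers to are not reproduced here. As a minimal illustrative building block, a sketch of the standard Biot–Savart result for the on-axis field of a single circular current loop is shown below; the current, radius, and axial offset are hypothetical, and a real coil would be modelled by superposing many such loops.

```python
from math import pi

MU0 = 4e-7 * pi  # vacuum permeability, T*m/A

def loop_axial_field(current, radius, z):
    """On-axis flux density B_z (tesla) of a circular current loop:
    B_z = mu0 * I * a^2 / (2 * (a^2 + z^2)^(3/2))."""
    return MU0 * current * radius ** 2 / (2.0 * (radius ** 2 + z ** 2) ** 1.5)
```

The full design problem also needs the off-axis field and its spherical-harmonic expansion, which is beyond this sketch.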
Abstract:
Supporting student learning can be difficult, especially within open-ended or loosely structured activities, which are often seen as valuable for promoting student autonomy in many curriculum areas and contexts. This paper reports an investigation into the experiences of three teachers who implemented design and technology education ideas in their primary school classrooms for the first time. The teachers did not capitalise upon many of the opportunities for scaffolding their students' learning within the open-ended activities they implemented. Limitations in the teachers' conceptual and procedural knowledge of design and technology influenced their early experiences. The study has implications for professional developers planning programs in newly introduced curriculum areas, helping teachers to support learning within open-ended and loosely structured problem-solving activities. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
Using benthic habitat data from the Florida Keys (USA), we demonstrate how siting algorithms can help identify potential networks of marine reserves that comprehensively represent target habitat types. We applied a flexible optimization tool, simulated annealing, to represent a fixed proportion of different marine habitat types within a geographic area. We investigated the relative influence of spatial information, planning-unit size, detail of habitat classification, and magnitude of the overall conservation goal on the resulting network scenarios. With this method, we were able to identify many adequate reserve systems that met the conservation goals, e.g., representing at least 20% of each conservation target (i.e., habitat type) while fulfilling the overall aim of minimizing the system area and perimeter. One of the most useful types of information provided by this siting algorithm comes from an irreplaceability analysis, which is a count of the number of times unique planning units were included in reserve system scenarios. This analysis indicated that many different combinations of sites produced networks that met the conservation goals. While individual 1-km² areas were fairly interchangeable, the irreplaceability analysis highlighted larger areas within the planning region that were chosen consistently to meet the goals incorporated into the algorithm. Additionally, we found that reserve systems designed with a high degree of spatial clustering tended to have considerably less perimeter and larger overall areas in reserve, a configuration that may be preferable particularly for sociopolitical reasons. This exercise illustrates the value of using the simulated annealing algorithm to help site marine reserves: the approach makes efficient use of available resources, can be used interactively by conservation decision makers, and offers biologically suitable alternative networks from which an effective system of marine reserves can be crafted.
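The authors' actual siting tool is not reproduced here, but the simulated-annealing idea the abstract describes can be sketched minimally: a subset of planning units is sought that minimizes selected area while meeting a 20% representation target for each habitat type. All habitat names and amounts below are hypothetical toy data, and the objective omits the perimeter term used in the real study.

```python
import math
import random

# Toy planning-unit data (hypothetical): 10 units of unit area, with the
# amount of each of three habitat types they contain.
HABITATS = {
    "seagrass":    [5, 0, 2, 0, 1, 0, 3, 0, 0, 4],
    "patch_reef":  [0, 4, 0, 3, 0, 2, 0, 1, 0, 0],
    "hard_bottom": [1, 0, 0, 0, 5, 0, 0, 2, 3, 0],
}
TARGET = 0.20    # represent at least 20% of each habitat type
PENALTY = 100.0  # cost per unit of unmet representation target

def cost(selected):
    """Objective: selected area plus heavy penalties for unmet targets."""
    shortfall = 0.0
    for amounts in HABITATS.values():
        held = sum(a for a, s in zip(amounts, selected) if s)
        shortfall += max(0.0, TARGET * sum(amounts) - held)
    return sum(selected) + PENALTY * shortfall

def anneal(steps=5000, t0=10.0, seed=1):
    """Simulated annealing over subsets of planning units."""
    rng = random.Random(seed)
    n = len(HABITATS["seagrass"])
    sel = [0] * n
    cur = cost(sel)
    best, best_cost = sel[:], cur
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-9  # linear cooling schedule
        cand = sel[:]
        cand[rng.randrange(n)] ^= 1           # flip one unit in or out
        c = cost(cand)
        # Always accept improvements; accept worse moves with a
        # temperature-dependent probability.
        if c < cur or rng.random() < math.exp((cur - c) / t):
            sel, cur = cand, c
            if c < best_cost:
                best, best_cost = sel[:], c
    return best, best_cost
```

Repeating `anneal` with different seeds yields the many alternative feasible networks the abstract mentions, which is what makes the irreplaceability count informative.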
Abstract:
Like many states and territories, South Australia has a legacy of marine reserves considered to be inadequate to meet current conservation objectives. In this paper we configured exploratory marine reserve systems, using the software MARXAN, to examine how efficiently South Australia's existing marine reserves contribute to quantitative biodiversity conservation targets. Our aim was to compare marine reserve systems that retain South Australia's existing marine reserves with reserve systems that are free to either ignore or incorporate them. We devised a new interpretation of irreplaceability to identify planning units selected more often than could be expected from chance alone. This is measured by comparing the observed selection frequency for an individual planning unit with a predicted selection frequency distribution. Knowing which sites make a valuable contribution to efficient marine reserve system design allows us to determine how well South Australia's existing reserves contribute to reservation goals when representation targets are set at 5, 10, 15, 20, 30 and 50% of conservation features. Existing marine reserves that fail to contribute to efficient marine reserve systems constitute 'opportunity costs'. We found that despite spanning less than 4% of South Australian state waters, locking in the existing ad hoc marine reserves presented considerable opportunity costs. Even with representation targets set at 50%, more than half of South Australia's existing marine reserves were selected randomly or less in efficient marine reserve systems. Hence, ad hoc marine reserve systems are likely to be inefficient and may compromise effective conservation of marine biodiversity.
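The authors' predicted selection frequency distribution is not specified in the abstract. One simple chance model, assumed here for illustration, treats each of R scenario runs as selecting k of N planning units uniformly at random, so a given unit's selection count follows Binomial(R, k/N); units whose observed count has a small tail probability under that distribution are flagged as selected more than chance. All parameter values are hypothetical.

```python
from math import comb

def binom_sf(count, runs, p):
    """P(X >= count) for X ~ Binomial(runs, p): the chance a unit is
    selected at least `count` times under purely random selection."""
    return sum(comb(runs, x) * p**x * (1.0 - p) ** (runs - x)
               for x in range(count, runs + 1))

def selected_more_than_chance(obs_counts, runs, units_per_run, n_units,
                              alpha=0.05):
    """Return indices of planning units whose observed selection count
    exceeds the random-selection prediction at significance level alpha."""
    p = units_per_run / n_units  # chance a given unit enters one run
    return [i for i, c in enumerate(obs_counts)
            if binom_sf(c, runs, p) < alpha]
```

For example, with 100 runs each selecting 20 of 100 units the expected count is 20, so a unit selected 35 times is flagged while one selected 22 times is not.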
Abstract:
As end-user computing becomes more pervasive, an organization's success increasingly depends on the ability of end-users, usually in managerial positions, to extract appropriate data from both internal and external sources. Many of these data sources include or are derived from the organization's accounting information systems. Managerial end-users with different personal characteristics and approaches are likely to compose queries of differing levels of accuracy when searching the data contained within these accounting information systems. This research investigates how cognitive style elements of personality influence managerial end-user performance in database querying tasks. A laboratory experiment was conducted in which participants generated queries to retrieve information from an accounting information system to satisfy typical information requirements. The experiment investigated the influence of personality on the accuracy of queries of varying degrees of complexity. Results based on the Myers–Briggs personality instrument show that perceiving individuals (as opposed to judging individuals) who rely on intuition (as opposed to sensing) composed queries more accurately. As expected, query complexity and academic performance also explain the success of data extraction tasks.
Abstract:
Computer Science is a subject which has difficulty in marketing itself. Further, pinning down a standard curriculum is difficult: there are many preferences which are hard to accommodate. This paper argues the case that part of the problem is the fact that, unlike more established disciplines, the subject does not clearly distinguish the study of principles from the study of artifacts. This point was raised in Curriculum 2001 discussions, and debate needs to start in good time for the next curriculum standard. This paper provides a starting point for debate by outlining a process by which principles and artifacts may be separated, and presents a sample curriculum to illustrate the possibilities. This sample curriculum has some positive points, though these are incidental to the need to start debating the issue. Other models, with a less rigorous ordering of principles before artifacts, would still gain from making it clearer whether a specific concept was fundamental or a property of a specific technology. (C) 2003 Elsevier Ltd. All rights reserved.
Abstract:
OTseeker (Occupational Therapy Systematic Evaluation of Evidence) is a new resource for occupational therapists that has been designed with the principal aim of increasing access to research to support clinical decisions. It contains abstracts of systematic reviews and quality ratings of randomized controlled trials (RCTs) relevant to occupational therapy. It is available, free of charge, at www.otseeker.com. This paper describes the OTseeker database and provides an example of how it may support occupational therapy practice.
Abstract:
One of the most important advantages of database systems is that the underlying mathematics is rich enough to specify very complex operations with a small number of statements in the database language. This research covers an aspect of biological informatics, the marriage of information technology and biology, involving the study of real-world phenomena using virtual plants derived from L-system simulation. L-systems were introduced by Aristid Lindenmayer as a mathematical model of multicellular organisms. Not much consideration has been given to the problem of persistent storage for these simulations, and current procedures for querying data generated by L-systems for scientific experiments, simulations and measurements are also inadequate. To address these problems, the research in this paper presents a generic process for data-modeling tools (L-DBM) between L-systems and database systems. This paper shows how L-system productions can be generically and automatically represented in database schemas and how a database can be populated from the L-system strings. This paper further describes the idea of pre-computing recursive structures in the data into derived attributes using compiler generation. A method to allow a correspondence between biologists' terms and compiler-generated terms in a biologist computing environment is supplied. Once the L-DBM is given a specific L-system's productions and declarations, it can generate the specific schema for both simple correspondence terminology and complex recursive-structure data attributes and relationships.
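The L-DBM tooling itself is not available here, but the core idea of populating a database from L-system strings can be sketched with Lindenmayer's original algae model (A → AB, B → A) and an in-memory SQLite table. The single-table schema below is a hypothetical stand-in, far simpler than the generated schemas with derived recursive attributes that the paper describes.

```python
import sqlite3

# Lindenmayer's algae L-system: parallel rewriting with A -> AB, B -> A.
PRODUCTIONS = {"A": "AB", "B": "A"}

def derive(axiom, steps):
    """Yield the axiom and each successive parallel-rewriting step."""
    s = axiom
    for _ in range(steps + 1):
        yield s
        s = "".join(PRODUCTIONS.get(ch, ch) for ch in s)

def populate(db, axiom="A", steps=4):
    """Store every module (symbol) of every derivation step as a row."""
    db.execute("CREATE TABLE module (generation INT, position INT, symbol TEXT)")
    for gen, s in enumerate(derive(axiom, steps)):
        db.executemany("INSERT INTO module VALUES (?, ?, ?)",
                       [(gen, i, ch) for i, ch in enumerate(s)])

con = sqlite3.connect(":memory:")
populate(con)
# String lengths per generation follow the Fibonacci numbers: 1, 2, 3, 5, 8.
rows = con.execute("SELECT generation, COUNT(*) FROM module "
                   "GROUP BY generation ORDER BY generation").fetchall()
```

Once the strings are rows, ordinary SQL aggregates replace ad hoc string processing, which is the querying gain the abstract argues for.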
Abstract:
The most widely used method for predicting the onset of continuous caving is Laubscher's caving chart. A detailed examination of this method concluded that it has limitations which may impact on results, particularly when dealing with stronger rock masses that are outside current experience. These limitations relate to inadequate guidelines for adjustment factors to rock mass rating (RMR), concerns about the position on the chart of critical case-history data, undocumented changes to the method and an inadequate number of data points to be confident of stability boundaries. A review was also undertaken of the application and reliability of a numerical method of assessing cavability. The review highlighted a number of issues which, at this stage, make numerical continuum methods problematic for predicting cavability, in particular sensitivity to input parameters that are difficult to determine accurately, and mesh dependency. An extended version of the Mathews method for open stope design was developed as an alternative method of predicting the onset of continuous caving. A number of caving case histories were collected and analyzed, and a caving boundary was delineated statistically on the Mathews stability graph. The definition of the caving boundary was aided by the existence of a large and wide-ranging stability database from non-caving mines. A caving-rate model was extrapolated from the extended Mathews stability graph but could only be partially validated due to a lack of reliable data.
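For readers unfamiliar with the Mathews stability graph: case histories are plotted as a stability number against the hydraulic radius (shape factor) of the excavation face, which for a rectangular face is simply area divided by perimeter. A minimal helper for that geometric quantity is sketched below; the stability-number calculation and the statistical boundary fitting from the paper are not reproduced.

```python
def hydraulic_radius(width, height):
    """Hydraulic radius (shape factor) of a rectangular face, in metres:
    face area divided by face perimeter."""
    return (width * height) / (2.0 * (width + height))
```

Note that a long narrow face and a compact face of equal area have different hydraulic radii, which is why the graph uses this shape factor rather than area alone.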
Abstract:
Measurement while drilling (MWD) techniques can provide a useful tool to aid drill and blast engineers in open cut mining. By avoiding time consuming tasks such as scan-lines and rock sample collection for laboratory tests, MWD techniques can not only save time but also improve the reliability of the blast design by providing the drill and blast engineer with the information specially tailored for use. While most mines use a standard blast pattern and charge per blasthole, based on a single rock factor for the entire bench or blast region, information derived from the MWD parameters can improve the blast design by providing more accurate rock properties for each individual blasthole. From this, decisions can be made on the most appropriate type and amount of explosive charge to place in a per blasthole or to optimise the inter-hole timing detonation time of different decks and blastholes. Where real-time calculations are feasible, the system could extend the present blast design even be used to determine the placement of subsequent holes towards a more appropriate blasthole pattern design like asymmetrical blasting.
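The mapping from MWD parameters to charge design is site-specific and not given in the abstract. A deliberately simplified, hypothetical sketch of the per-blasthole idea might scale a base powder factor by a normalised rock factor derived from that hole's MWD data, then multiply by the rock volume the hole must break; every parameter name and value below is an illustrative assumption, not the paper's method.

```python
def charge_per_hole(rock_factor, burden, spacing, bench_height,
                    base_powder_factor=0.5):
    """Hypothetical per-blasthole explosive charge (kg): scale a base
    powder factor (kg/m^3) by a normalised, MWD-derived rock factor,
    then multiply by the rock volume assigned to the hole (m^3)."""
    return base_powder_factor * rock_factor * burden * spacing * bench_height
```

A hole drilled through harder ground (rock factor above 1.0) would thus receive proportionally more explosive than the bench-wide standard charge.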
Abstract:
Blasting has been the most frequently used method for rock breakage since black powder was first used to fragment rocks, more than two hundred years ago. This paper is an attempt to reassess standard design techniques used in blasting by providing an alternative approach to blast design. The new approach has been termed asymmetric blasting. Based on providing real-time rock recognition through the capacity of measurement while drilling (MWD) techniques, asymmetric blasting is an approach to dealing with rock properties as they occur in nature, i.e., randomly and asymmetrically spatially distributed. It is well accepted that the performance of basic mining operations, such as excavation and crushing, relies on a broken rock mass which has been pre-conditioned by the blast. By pre-conditioned we mean well fragmented, sufficiently loose and with an adequate muckpile profile. These muckpile characteristics affect loading and hauling [1]. The influence of blasting does not end there: under the Mine to Mill paradigm, blasting has significant leverage on downstream operations such as crushing and milling, and there is a body of evidence that blasting affects mineral liberation [2]. Thus, the importance of blasting has increased from simply fragmenting and loosening the rock mass to a broader role that encompasses many aspects of mining and affects the cost of the end product. A new approach is proposed in this paper which facilitates this trend: to treat non-homogeneous media (the rock mass) in a non-homogeneous manner (an asymmetrical pattern) in order to achieve an optimal result (in terms of muckpile size distribution). It is postulated that there are no logical reasons (besides the current lack of means to infer rock mass properties in the blind zones of the bench, and onsite precedents) for drilling a regular blast pattern over a rock mass that is inherently heterogeneous. Real and theoretical examples of such a method are presented.