925 results for Model basic science research
Abstract:
Милослав A. Средков - The concepts of model and modelling are used so intensively in so many disciplines that it is difficult to give them a concrete meaning. Even in Software Engineering, the understanding of these concepts depends strongly on the context. We believe this contributes to the inconsistency among modelling approaches in that field. In this article we point out some of the resulting problems, as well as the importance of having a suitable definition combined with suitable tools. We review the more general definitions of model beyond the boundaries of Software Engineering. We go through the modelling of various by-products of Software Engineering processes. Finally, we present our vision of how such a common foundation can be used to our advantage.
Abstract:
In support of research in the debate concerning its relevance to hospitality academics and practitioners, the author presents a discussion of how the philosophy of science impacts approaches to research, including a brief summary of empiricism and the importance of the triangulation of research orientations. Criticism of research in the hospitality literature often focuses on the lack of an apparent philosophy-of-science perspective and on how this perspective impacts the way in which scholars conduct and interpret research. The Validity Network Schema (VNS) presents a triangulation model for evaluating research progress in a discipline by providing a mechanism for integrating academic and practitioner research studies.
Abstract:
Surveys can collect important data that inform policy decisions and drive social science research. Large government surveys collect information from the U.S. population on a wide range of topics, including demographics, education, employment, and lifestyle. Analysis of survey data presents unique challenges. In particular, one needs to account for missing data, for complex sampling designs, and for measurement error. Conceptually, a survey organization could spend lots of resources getting high-quality responses from a simple random sample, resulting in survey data that are easy to analyze. However, this scenario often is not realistic. To address these practical issues, survey organizations can leverage the information available from other sources of data. For example, in longitudinal studies that suffer from attrition, they can use the information from refreshment samples to correct for potential attrition bias. They can use information from known marginal distributions or survey design to improve inferences. They can use information from gold standard sources to correct for measurement error.
This thesis presents novel approaches to combining information from multiple sources that address the three problems described above.
The first method addresses nonignorable unit nonresponse and attrition in a panel survey with a refreshment sample. Panel surveys typically suffer from attrition, which can lead to biased inference when basing analysis only on cases that complete all waves of the panel. Unfortunately, the panel data alone cannot inform the extent of the bias due to attrition, so analysts must make strong and untestable assumptions about the missing data mechanism. Many panel studies also include refreshment samples, which are data collected from a random sample of new individuals during some later wave of the panel. Refreshment samples offer information that can be utilized to correct for biases induced by nonignorable attrition while reducing reliance on strong assumptions about the attrition process. To date, these bias correction methods have not dealt with two key practical issues in panel studies: unit nonresponse in the initial wave of the panel and in the refreshment sample itself. As we illustrate, nonignorable unit nonresponse can significantly compromise the analyst's ability to use the refreshment samples for attrition bias correction. Thus, it is crucial for analysts to assess how sensitive their inferences, corrected for panel attrition, are to different assumptions about the nature of the unit nonresponse. We present an approach that facilitates such sensitivity analyses, both for suspected nonignorable unit nonresponse in the initial wave and in the refreshment sample. We illustrate the approach using simulation studies and an analysis of data from the 2007-2008 Associated Press/Yahoo News election panel study.
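To see why a refreshment sample helps, consider a two-wave panel with outcomes Y_1, Y_2 and an indicator W = 1 if the respondent completes wave 2. A common specification in this literature, given here only as a sketch and not necessarily the parameterisation used in the thesis, is the additive nonignorable attrition model

\[ \operatorname{logit}\, \Pr(W = 1 \mid Y_1, Y_2) = \gamma_0 + \gamma_1 Y_1 + \gamma_2 Y_2 . \]

The panel alone cannot distinguish attrition driven by Y_1 (observed for completers and dropouts alike) from attrition driven by Y_2 (nonignorable), but the refreshment sample supplies the population distribution of Y_2 at the later wave, which, together with the assumed exclusion of a Y_1 Y_2 interaction, identifies the model.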
The second method incorporates informative prior beliefs about marginal probabilities into Bayesian latent class models for categorical data. The basic idea is to append synthetic observations to the original data such that (i) the empirical distributions of the desired margins match those of the prior beliefs, and (ii) the values of the remaining variables are left missing. The degree of prior uncertainty is controlled by the number of augmented records. Posterior inferences can be obtained via typical MCMC algorithms for latent class models, tailored to deal efficiently with the missing values in the concatenated data.
We illustrate the approach using a variety of simulations based on data from the American Community Survey, including an example of how augmented records can be used to fit latent class models to data from stratified samples.
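A minimal sketch of the augmentation step, in Python with hypothetical column names and prior values (the subsequent latent class MCMC on the concatenated data is not shown), could look like this:

```python
import numpy as np
import pandas as pd

def augment_with_margin(data, var, prior_margin, n_aug, seed=None):
    """Append n_aug synthetic records whose values of `var` follow the prior
    marginal distribution, with every other variable left missing.
    The number of augmented records controls the strength of the prior:
    more records pull the posterior margin more tightly toward the prior."""
    rng = np.random.default_rng(seed)
    levels = list(prior_margin.keys())
    probs = np.asarray([prior_margin[l] for l in levels], dtype=float)
    probs = probs / probs.sum()
    counts = rng.multinomial(n_aug, probs)          # allocate records to levels
    synthetic = pd.DataFrame({col: np.nan for col in data.columns},
                             index=range(n_aug))    # all variables missing
    synthetic[var] = np.repeat(levels, counts)      # desired margin imposed
    return pd.concat([data, synthetic], ignore_index=True)

# Hypothetical usage: encode a prior belief that 60% of the population has at
# most a high-school education and 40% has more.
# acs = pd.DataFrame(...)  # original survey data
# augmented = augment_with_margin(acs, "educ", {"<=HS": 0.6, ">HS": 0.4}, n_aug=500)
```

The missing values in the synthetic rows are then handled by the imputation step of a standard latent class MCMC, as described above.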
The third method leverages the information from a gold standard survey to model reporting error. Survey data are subject to reporting error when respondents misunderstand the question or accidentally select the wrong response. Sometimes survey respondents knowingly select the wrong response, for example, by reporting a higher level of education than they actually have attained. We present an approach that allows an analyst to model reporting error by incorporating information from a gold standard survey. The analyst can specify various reporting error models and assess how sensitive their conclusions are to different assumptions about the reporting error process. We illustrate the approach using simulations based on data from the 1993 National Survey of College Graduates. We use the method to impute error-corrected educational attainments in the 2010 American Community Survey using the 2010 National Survey of College Graduates as the gold standard survey.
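One generic way to write down such a reporting error model, given here only as a sketch since the thesis considers several specifications, is a misclassification matrix linking the true value Y to the reported value Z:

\[ \Pr(Z = z \mid Y = y) = \begin{cases} 1 - \epsilon_y, & z = y, \\ \epsilon_y \, \pi_{yz}, & z \neq y, \end{cases} \qquad \sum_{z \neq y} \pi_{yz} = 1 , \]

where ε_y is the probability that a respondent with true value y misreports and π_{yz} distributes the errors across the wrong categories. Comparing the error-prone survey with the gold standard survey informs ε_y and π_{yz}, and asymmetric choices (for example, making over-reporting of educational attainment more likely than under-reporting) supply the alternative error processes for the sensitivity analysis.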
Abstract:
In Marxist frameworks “distributive justice” depends on extracting value through a centralized state. Many new social movements—peer to peer economy, maker activism, community agriculture, queer ecology, etc.—take the opposite approach, keeping value in its unalienated form and allowing it to freely circulate from the bottom up. Unlike Marxism, there is no general theory for bottom-up, unalienated value circulation. This paper examines the concept of “generative justice” through an historical contrast between Marx’s writings and the indigenous cultures that he drew upon. Marx erroneously concluded that while indigenous cultures had unalienated forms of production, only centralized value extraction could allow the productivity needed for a high quality of life. To the contrary, indigenous cultures now provide a robust model for the “gift economy” that underpins open source technological production, agroecology, and restorative approaches to civil rights. Expanding Marx’s concept of unalienated labor value to include unalienated ecological (nonhuman) value, as well as the domain of freedom in speech, sexual orientation, spirituality and other forms of “expressive” value, we arrive at an historically informed perspective for generative justice.
Abstract:
This article presents an interdisciplinary experience that brings together two areas of computer science: didactics and philosophy. As such, the article introduces a relatively unexplored area of research, not only in Uruguay but in the whole Latin American region. Reflection on the ontological status of computer science, its epistemic and educational problems, and its relationship with technology allows us to elaborate a critical analysis of the discipline and of its social perception as a basic science.
Abstract:
Virtual-build-to-order (VBTO) is a form of order fulfilment system in which the producer has the ability to search across the entire pipeline of finished stock, products in production and those in the production plan, in order to find the best product for a customer. It is a system design that is attractive to Mass Customizers, such as those in the automotive sector, whose manufacturing lead time exceeds their customers' tolerable waiting times, and for whom the holding of partly-finished stocks at a fixed decoupling point is unattractive or unworkable. This paper describes and develops the operational concepts that underpin VBTO, in particular the concepts of reconfiguration flexibility and customer aversion to waiting. Reconfiguration is the process of changing a product's specification at any point along the order fulfilment pipeline. The extent to which an order fulfilment system is flexible or inflexible reveals itself in the reconfiguration cost curve, of which there are four basic types. The operational features of the generic VBTO system are described and simulation is used to study its behaviour and performance. The concepts of reconfiguration flexibility and floating decoupling point are introduced and discussed.
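The pipeline-search idea can be sketched as follows in Python, with purely illustrative cost functions and field names; the paper's simulation model of the generic VBTO system is considerably more detailed.

```python
from dataclasses import dataclass

@dataclass
class PipelineUnit:
    spec: dict                 # current product specification
    weeks_to_completion: int   # 0 = finished stock; larger = earlier in the pipeline

def reconfiguration_cost(unit, desired_spec):
    """Illustrative reconfiguration cost curve: each mismatching attribute costs
    more to change the closer the unit is to completion; an inflexible system
    would make late changes prohibitively expensive."""
    mismatches = sum(unit.spec.get(k) != v for k, v in desired_spec.items())
    return mismatches * (1.0 + 0.5 * max(0, 10 - unit.weeks_to_completion))

def fulfil_order(pipeline, desired_spec, waiting_cost_per_week=2.0):
    """Search the whole pipeline (finished stock, work in progress, planned builds)
    for the unit minimising reconfiguration cost plus an illustrative
    customer aversion-to-waiting penalty."""
    return min(
        pipeline,
        key=lambda u: reconfiguration_cost(u, desired_spec)
        + waiting_cost_per_week * u.weeks_to_completion,
    )
```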
Abstract:
Nowadays, risks arising from the rapid development of the oil and gas industries are increasing significantly. As a result, one of the main concerns of both industrial and environmental managers is the identification and assessment of such risks in order to develop and maintain appropriate proactive measures. Oil spills from stationary sources in offshore zones are among the accidents with several adverse impacts on marine ecosystems. Considering a site's current situation and the relevant requirements and standards, the risk assessment process is capable not only of recognizing the probable causes of accidents but also of estimating the probability of occurrence and the severity of consequences. In this way, the results of risk assessment help managers and decision makers create and employ proper control methods. Most existing models for risk assessment of oil spills are built on accurate databases and the analysis of historical data, but unfortunately such databases are not accessible in most zones, especially in developing countries, or else they are newly established and not yet usable. This reveals the need for Expert Systems and Fuzzy Set Theory, which make it possible to formalize the expertise and experience of specialists who have worked in petroliferous areas for many years. On the other hand, in developing countries damages to the environment and environmental resources are often not treated as risk assessment priorities and tend to be underestimated. For this reason, the model proposed in this research specifically addresses the environmental risk of oil spills from stationary sources in offshore zones.
Abstract:
This article analyzes Boys in white: student culture in medical school by Howard S. Becker, Blanche Geer, Everett C. Hughes and Anselm Strauss, considered a model of qualitative research in sociology. The analysis investigates the trajectories of the authors, the book, qualitative analysis, and the medical students, emphasizing their importance in the origins of medical sociology and the sociology of medical education. In the trajectory of the authors, bibliographical information is given. The trajectory of qualitative research focuses on how this methodology influenced the construction of the field. The investigation of the students' trajectory shows how they progress through their first years at medical school to build their own student culture.
Abstract:
In Brazil, the study of pedestrian-induced vibration on footbridges has been undertaken since the early 1990s for concrete and steel footbridges. However, there are no recorded studies of this kind for timber footbridges. The Brazilian code ABNT NBR 7190 (1997) gives design requirements only for static loads in the case of timber footbridges, without considering the serviceability limit state for pedestrian-induced vibrations. The aim of this work is to perform a theoretical, numerical and experimental dynamic analysis of simply-supported timber footbridges, using a small-scale model developed from a 24 m span and 2 m wide timber footbridge with two main timber beams. Span and width were scaled down (1:4) to 6 m and 0.5 m, respectively. Among the conclusions reached herein, it is emphasized that the Euler-Bernoulli beam theory is suitable for calculating the first vertical and lateral natural frequencies of simply-supported timber footbridges; however, special attention should be given to the evaluation of the lateral bending stiffness, as it leads to conservative values.
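For context, the Euler-Bernoulli estimate referred to above is a standard result: for a simply supported span, the n-th natural bending frequency is

\[ f_n = \frac{n^2 \pi}{2 L^2} \sqrt{\frac{EI}{\rho A}} , \qquad n = 1, 2, \dots , \]

where L is the span, EI the bending stiffness about the relevant (vertical or lateral) axis and ρA the mass per unit length. The strong sensitivity of f_1 to EI is why the evaluation of the lateral bending stiffness deserves the special attention noted above.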
Abstract:
Context. Precise S abundances are important in the study of the early chemical evolution of the Galaxy. In particular, the site of its formation remains uncertain because, at low metallicity, the trend of this alpha-element versus [Fe/H] remains unclear. Moreover, although sulfur is not bound significantly in dust grains in the ISM, it seems to behave differently in DLAs and old metal-poor stars. Aims. We attempt a precise measurement of the S abundance in a sample of extremely metal-poor stars observed with the ESO VLT equipped with UVES, taking into account NLTE and 3D effects. Methods. The NLTE profiles of the lines of multiplet 1 of S I were computed with a version of the program MULTI, including opacity sources from ATLAS9 and based on a new model atom for S. These profiles were fitted to the observed spectra. Results. We find that sulfur in EMP stars behaves like the other alpha-elements, with [S/Fe] remaining approximately constant below [Fe/H] = -3. However, [S/Mg] seems to decrease slightly with increasing [Mg/H]. The overall abundance patterns of O, Na, Mg, Al, S, and K are most closely matched by the SN model yields of Heger & Woosley. The [S/Zn] ratio in EMP stars is solar, as also found in DLAs. We derive an upper limit to the sulfur abundance, [S/Fe] < +0.5, for the ultra metal-poor star CS 22949-037. This, along with a previously reported measurement of zinc, argues against the conjecture that the light-element abundance pattern of this star (and by analogy, the hyper iron-poor stars HE 0107-5240 and HE 1327-2326) would be due to dust depletion.
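For readers outside stellar spectroscopy, the bracket notation used in this and the following abstracts denotes a logarithmic abundance ratio relative to the Sun,

\[ [\mathrm{A}/\mathrm{B}] = \log_{10}\!\left(\frac{N_\mathrm{A}}{N_\mathrm{B}}\right)_{\star} - \log_{10}\!\left(\frac{N_\mathrm{A}}{N_\mathrm{B}}\right)_{\odot} , \]

so [Fe/H] = -3 means an iron abundance one thousandth of the solar value, and a roughly constant [S/Fe] below [Fe/H] = -3 means that sulfur tracks iron at a fixed, alpha-enhanced ratio in those stars.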
Abstract:
Context. The chemical composition of extremely metal-poor stars (EMP stars; [Fe/H] ≲ -3) is a unique tracer of early nucleosynthesis in the Galaxy. As such stars are rare, we wish to find classes of luminous stars which can be studied at high spectral resolution. Aims. We aim to determine the detailed chemical composition of the two EMP stars CS 30317-056 and CS 22881-039, originally thought to be red horizontal-branch (RHB) stars, and compare it to earlier results for EMP stars as well as to nucleosynthesis yields from various supernova (SN) models. In the analysis, we discovered that our targets are in fact the two most metal-poor RR Lyrae stars known. Methods. Our detailed abundance analysis, taking into account the variability of the stars, is based on VLT/UVES spectra (R ≃ 43 000) and 1D LTE OSMARCS model atmospheres and synthetic spectra. For comparison with SN models we also estimate NLTE corrections for a number of elements. Results. We derive LTE abundances for the 16 elements O, Na, Mg, Al, Si, S, Ca, Sc, Ti, Cr, Mn, Fe, Co, Ni, Sr and Ba, in good agreement with earlier values for EMP dwarf, giant and RHB stars. Li and C are not detected in either star. NLTE abundance corrections are newly calculated for O and Mg and taken from the literature for other elements. The resulting abundance pattern is best matched by model yields for supernova explosions with high energy and/or significant asphericity effects. Conclusions. Our results indicate that, except for Li and C, the surface composition of EMP RR Lyr stars is not significantly affected by mass loss, mixing or diffusion processes; hence, EMP RR Lyr stars should also be useful tracers of the chemical evolution of the early Galactic halo. The observed abundance ratios indicate that these stars were born from an ISM polluted by energetic, massive (25-40 M⊙) and/or aspherical supernovae, but the NLTE corrections for Sc and certain other elements do play a role in the choice of model.
Abstract:
Context. The detailed chemical abundances of extremely metal-poor (EMP) stars are key guides to understanding the early chemical evolution of the Galaxy. Most existing data, however, treat giant stars that may have experienced internal mixing later. Aims. We aim to compare the results for giants with new, accurate abundances for all observable elements in 18 EMP turnoff stars. Methods. VLT/UVES spectra at R ≈ 45 000 and S/N ≈ 130 per pixel (λλ 330-1000 nm) are analysed with OSMARCS model atmospheres and the TURBOSPECTRUM code to derive abundances for C, Mg, Si, Ca, Sc, Ti, Cr, Mn, Co, Ni, Zn, Sr, and Ba. Results. For Ca, Ni, Sr, and Ba, we find excellent consistency with our earlier sample of EMP giants, at all metallicities. However, our abundances of C, Sc, Ti, Cr, Mn and Co are ≈ 0.2 dex larger than in giants of similar metallicity. Mg and Si abundances are ≈ 0.2 dex lower (the giant [Mg/Fe] values are slightly revised), while Zn is again ≈ 0.4 dex higher than in giants of similar [Fe/H] (6 stars only). Conclusions. For C, the dwarf/giant discrepancy could possibly have an astrophysical cause, but for the other elements it must arise from shortcomings in the analysis. Approximate computations of granulation (3D) effects yield smaller corrections for giants than for dwarfs, but suggest that this is an unlikely explanation, except perhaps for C, Cr, and Mn. NLTE computations for Na and Al provide consistent abundances between dwarfs and giants, unlike the LTE results, and would be highly desirable for the other discrepant elements as well. Meanwhile, we recommend using the giant abundances as reference data for Galactic chemical evolution models.
Abstract:
Leakage reduction in water supply systems and distribution networks has been an increasingly important issue in the water industry, since leaks and ruptures result in major physical and economic losses. Hydraulic transient solvers can be used in system operational diagnosis, namely for leak detection purposes, due to their capability to describe the dynamic behaviour of the systems and to provide substantial amounts of data. In this research work, the association of hydraulic transient analysis with an optimisation model, through inverse transient analysis (ITA), has been used for leak detection and location in an experimental facility containing PVC pipes. Observed transient pressure data have been used for testing ITA. A key factor for the success of the leak detection technique used is the accurate calibration of the transient solver, namely adequate boundary conditions and the description of energy dissipation effects, since PVC pipes are characterised by a viscoelastic mechanical response. Results have shown that leaks were located with an accuracy of 4-15% of the total pipeline length, depending on the discretisation of the system model.
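In outline, inverse transient analysis casts leak location as an optimisation problem. A minimal Python sketch follows, assuming a calibrated transient solver simulate_heads(position, area) is available (a hypothetical function; this is where the boundary conditions and the viscoelastic behaviour of the PVC pipes enter):

```python
import numpy as np
from scipy.optimize import minimize

def ita_objective(params, observed_heads, simulate_heads):
    """Sum of squared deviations between observed and simulated transient
    pressure heads for a candidate leak (position along the pipe, orifice area)."""
    position, area = params
    simulated = simulate_heads(position, area)   # hypothetical calibrated solver
    return float(np.sum((observed_heads - simulated) ** 2))

def locate_leak(observed_heads, simulate_heads, pipe_length):
    """Search candidate leak positions and sizes by inverse transient analysis."""
    x0 = np.array([0.5 * pipe_length, 1e-5])      # start: mid-pipe, small orifice
    bounds = [(0.0, pipe_length), (0.0, 1e-3)]    # position [m], area [m^2]
    result = minimize(ita_objective, x0,
                      args=(observed_heads, simulate_heads),
                      method="L-BFGS-B", bounds=bounds)
    return result.x  # estimated leak position and effective orifice area
```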
Abstract:
In this paper, a review of thermodynamic approaches to sliding wear is presented. These approaches are divided into friction energy dissipation, energy balance and entropy production. A concise and critical account of the approaches is given, highlighting their relative strengths and weaknesses in explaining the phenomena that occur during sliding wear.
Abstract:
Estimation of Taylor's power law for species abundance data may be performed by linear regression of the log empirical variances on the log means, but this method suffers from a problem of bias for sparse data. We show that the bias may be reduced by using a bias-corrected Pearson estimating function. Furthermore, we investigate a more general regression model allowing for site-specific covariates. This method may be efficiently implemented using a Newton scoring algorithm, with standard errors calculated from the inverse Godambe information matrix. The method is applied to a set of biomass data for benthic macrofauna from two Danish estuaries.
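The regression mentioned above follows from taking logarithms of Taylor's power law, a standard reformulation stated here for context:

\[ \sigma_i^2 = a \, \mu_i^{\,b} \;\Longrightarrow\; \log s_i^2 = \log a + b \log \bar{y}_i + \varepsilon_i , \]

where \(\bar{y}_i\) and \(s_i^2\) are the empirical mean and variance of abundance at site i. For sparse data, \(\log s_i^2\) is a strongly biased estimate of \(\log \sigma_i^2\), which is one source of the bias that the bias-corrected Pearson estimating function is designed to reduce, while the more general model lets the parameters depend on site-specific covariates.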