940 results for practical epistemology analysis


Relevance: 30.00%

Abstract:

A sign of presence in virtual environments is that people respond to situations and events as if they were real, where the response may be considered at many different levels, ranging from unconscious physiological responses through to overt behavior, emotions, and thoughts. In this paper we consider two responses that gave different indications of the onset of presence in a gradually forming environment. Two aspects of participants' response to an immersive virtual environment were recorded: their eye scanpath and their skin conductance response (SCR). The scenario was formed over a period of 2 min by introducing an increasing number of its polygons in random order in a head-tracked head-mounted display. For one group of experimental participants (n = 8) the environment formed into one in which they found themselves standing on top of a 3 m high column. For a second group of participants (n = 6) the environment was otherwise the same except that the column was only 1 cm high, so that they would be standing at normal ground level. For a third group of participants (n = 14) the polygons never formed into a meaningful environment. The participants who stood on top of the tall column exhibited a significant decrease in the entropy of the eye scanpath and an increase in the number of SCRs by 99 s into the scenario, at a time when only 65% of the polygons had been displayed. The ground-level participants exhibited a similar decrease in scanpath entropy, but not the increase in SCRs. The random-scenario group did not exhibit this decrease in eye scanpath entropy. A drop in scanpath entropy indicates that the environment had cohered into a meaningful perception. An increase in the rate of SCRs indicates the perception of an aversive stimulus. These results suggest that on these two dimensions (scanpath entropy and rate of SCRs) participants were responding realistically to the scenario shown in the virtual environment. In addition, the response occurred well before the entire scenario had been displayed, suggesting that once a minimal set of cues exists within a scenario, it is enough to form a meaningful perception. Moreover, at the level of the sympathetic nervous system, the participants who were standing on top of the column exhibited arousal as if their experience might be real. This is an important practical aspect of the concept of presence.
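
The two measures described above are both straightforward to quantify. The sketch below shows one common way to compute scanpath entropy (the Shannon entropy of the distribution of fixations over discrete gaze regions) and an SCR rate within a time window. It is a minimal illustration only; the region binning, function names and example values are assumptions, not the authors' actual analysis pipeline.

    import math
    from collections import Counter

    def scanpath_entropy(fixation_regions):
        """Shannon entropy (bits) of the distribution of fixations over gaze regions.

        fixation_regions: one region label per fixation, in temporal order.
        Lower entropy means gaze is concentrated on fewer regions, i.e. the scene
        has cohered into a meaningful percept.
        """
        counts = Counter(fixation_regions)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def scr_rate(scr_event_times, t_start, t_end):
        """Number of skin conductance responses per minute within [t_start, t_end) seconds."""
        n = sum(t_start <= t < t_end for t in scr_event_times)
        return n / ((t_end - t_start) / 60.0)

    # Invented example: gaze settling onto the column, SCR events clustering late in the scenario.
    print(scanpath_entropy(["sky", "column", "floor", "column", "column", "column"]))
    print(scr_rate([35.0, 70.5, 82.1, 95.3], t_start=60.0, t_end=120.0))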

Relevance: 30.00%

Abstract:

The aim of this thesis was to clarify the logistics process between Elcoteq's units in Finland and Russia, and in particular the problems associated with it. The work was carried out for Elcoteq's Engineering Services unit, which has typically not had knowledge of these matters. The main working methods were interviews and familiarization with the logistics pipeline in practice. Literature also played its part, both regarding customs clearance and the related theories. Combining theory and practical experience, the work proceeded to analysis and comparison, on the basis of which recommendations were given with future projects in mind. Future expectations are also discussed at a general level, mainly in relation to the opportunities offered by various development programmes. Logistics between Finland and Russia is considerably more complex than it appears at first glance. There are problems with border crossing, customs clearance and the licensing process alike. Much can nevertheless be done to improve matters; the development potential is enormous. Strong investment in developing business in Russia must continue, even though the associated problems at times seem overwhelmingly large. Particularly frustrating are the issues that cannot be influenced. As the work progressed, however, new perspectives emerged that had not previously been taken into account. More attention must be paid to product selection in the future. Nevertheless, research into speeding up transport and border crossing must also continue.

Relevance: 30.00%

Abstract:

This thesis reviews time-based and stochastic software reliability models and studies a few of the models in practice. The theoretical part contains the central definitions and metrics used in describing and evaluating software reliability, as well as the descriptions of the models themselves. Two groups of software reliability models are presented. The first group consists of risk-based models. The second group comprises models based on fault "seeding" and significance. The empirical part contains the descriptions and results of the experiments. The experiments were carried out using three models belonging to the first group: the Jelinski-Moranda model, the first geometric model and the simple exponential model. The purpose of the experiments was to study how the distribution of the input data affects the performance of the models and how sensitive the models are to changes in the amount of input data. The Jelinski-Moranda model proved to be the most sensitive to the distribution because of convergence problems, and the first geometric model the most sensitive to changes in the amount of data.
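
As an illustration of the first group of models, the sketch below fits the Jelinski-Moranda model to a set of inter-failure times by maximum likelihood: for a fixed total number of faults N, the failure rate before the i-th failure is phi*(N - i + 1), phi has a closed-form estimate for each N, and N is chosen by a simple grid search. This is a minimal sketch under those standard model assumptions (the data and the search range are invented), not the implementation evaluated in the thesis; with data that show little reliability growth the profile likelihood keeps increasing with N, which is one form of the convergence problem noted above.

    import math

    def jelinski_moranda_fit(interfailure_times, n_max=10_000):
        """Maximum-likelihood fit of the Jelinski-Moranda model.

        interfailure_times: observed times t_1..t_n between successive failures.
        For a fixed N, the MLE of the per-fault rate is phi = n / sum((N - i + 1) * t_i);
        N itself is found by a grid search over N >= n. Returns (N_hat, phi_hat, log-likelihood).
        """
        t = list(interfailure_times)
        n = len(t)
        best = None
        for N in range(n, n_max + 1):
            denom = sum((N - i + 1) * ti for i, ti in enumerate(t, start=1))
            phi = n / denom
            loglik = sum(math.log(phi * (N - i + 1)) - phi * (N - i + 1) * ti
                         for i, ti in enumerate(t, start=1))
            if best is None or loglik > best[2]:
                best = (N, phi, loglik)
        return best

    # Invented inter-failure times (hours); real data would come from the test logs.
    print(jelinski_moranda_fit([7, 11, 18, 26, 40, 55, 73, 100]))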

Relevance: 30.00%

Abstract:

This thesis studies the evaluation of software development practices through an error analysis. The work presents the software development process, software testing, software errors, error classification, and software process improvement methods. The practical part of the work presents results from the error analysis of one software process and gives improvement ideas for the project. It was noticed that the classification of the error data in the project was inadequate, which made it impossible to use the error data effectively. The error analysis nevertheless showed that there were deficiencies in the design and analysis phases, the implementation phase, and the testing phase. The work gives ideas for improving error classification and software development practices.

Relevance: 30.00%

Abstract:

2010 marks the hundredth anniversary of the death of Léon Walras, the brilliant originator and first formaliser of general equilibrium theory, one of the pillars of modern economic theory. In advancing his much-derided practical solutions, Walras also displayed more concern for the problems of living in a second-best world than is common in modern pure theories of the invisible hand, the efficient market hypothesis, DSGE macroeconomics, or the thinking of some contemporary free-market admirers, all of which are based on general equilibrium theory. This book brings together contributions from the likes of Kenneth Arrow, Alan Kirman, Richard Posner, Amartya Sen and Robert Solow, who share their thoughts and reflections on the theoretical heritage of Léon Walras. Some authors reminisce about the part they played in the development of modern general equilibrium theory; others reflect on the crucial part played by general equilibrium in the development of macroeconomics, microeconomics, growth theory, welfare economics and the theory of justice; others still complain about the wrong path economic theory took under the influence of post-1945 developments in general equilibrium theory.

Relevance: 30.00%

Abstract:

In this book, I apply a philosophical approach to the study of the precautionary principle in environmental (and health) risk decision-making. The principle says that unacceptable environmental and health risks should be anticipated and forestalled before the damage comes to fruition, even if scientific understanding of the risks is inadequate. The study consists of introductory chapters, a summary, and seven original publications, which aim at explicating the principle, critically analysing the debate on the principle, and constructing a basis for its well-founded use. Papers I-V present the main thesis of this research; in the two last papers, the discussion is widened in new directions. The starting question is how well the currently embraced precautionary principle stands up to critical philosophical scrutiny. The approach employed is analytical: mainly conceptual, argumentative and ethical. The study draws upon Anglo-American style philosophy on the one hand, and upon sources of law as well as concrete cases and decision-making practices at the European Union level and in its member countries on the other. The framework is environmental (and health) risk governance, including the related law and policy. The main thesis of this study is that the debate on the precautionary principle needs to be shifted from the question of whether the principle (or its weak or strong interpretation) is well-grounded in general to questions about the theoretical plausibility and the ethical and socio-political justifiability of specific understandings of the principle. The real picture of the precautionary principle is more complex than that found (i.e. presumed) in much of the current academic, political and public debate surrounding it. While certain presumptions and interpretations of the principle are found to be sound, others are theoretically flawed or involve serious practical problems. The analysis discloses conceptual and ethical presumptions and elementary understandings of the precautionary principle, critically assesses current practices invoked in the name of the precautionary principle and public participation, and seeks to build bridges between precaution, engagement and philosophical ethics. It is thus intended to provide a sound basis upon which subsequent academic scrutiny can build.

Relevance: 30.00%

Abstract:

BACKGROUND: The purpose of this study is to validate the Pulvers silhouette showcard as a measure of weight status in a population in the African region. This tool is particularly useful when scarce resources do not allow for direct anthropometric measurements, whether because of limited survey time or lack of measurement technology, in face-to-face general-purpose surveys or in mailed, online, or mobile device-based surveys. METHODS: A cross-sectional study was conducted in the Republic of Seychelles with a sample of 1240 adults. We compared self-reported body sizes measured with Pulvers' silhouette showcards to four measurements of body size and adiposity: body mass index (BMI), measured body fat percent, waist circumference, and waist-to-height ratio. The accuracy of silhouettes as an obesity indicator was examined using sex-specific receiver operating characteristic (ROC) analysis, and the reliability of this tool in detecting socioeconomic gradients in obesity was compared to BMI-based measurements. RESULTS: Our study supports silhouette body size showcards as a valid and reliable survey tool for measuring self-reported body size and adiposity in an African population. The mean correlation coefficients of self-reported silhouettes with measured BMI were 0.80 in men and 0.81 in women (P < 0.001). The silhouette showcards also showed high accuracy for detecting obesity defined as BMI ≥ 30 (area under the curve, AUC: 0.91/0.89, SE: 0.01), which was comparable to the other measured adiposity indicators: fat percent (AUC: 0.94/0.94, SE: 0.01), waist circumference (AUC: 0.95/0.94, SE: 0.01), and waist-to-height ratio (AUC: 0.95/0.94, SE: 0.01) among men and women, respectively. The use of silhouettes in detecting obesity differences among socioeconomic groups resulted in a similar magnitude, direction, and significance of the association between obesity and socioeconomic status as when measured BMI was used. CONCLUSIONS: This study highlights the validity and reliability of silhouettes as a survey tool for measuring obesity in a population in the African region. The ease of use and cost-effectiveness of this tool make it an attractive alternative to measured BMI in the design of non-face-to-face online or mobile device-based surveys, as well as in-person general-purpose surveys of obesity in the social sciences, where limited resources do not allow for direct anthropometric measurements.
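
The ROC-based accuracy analysis described above can be reproduced in outline with the rank-sum (Mann-Whitney) identity for the area under the curve: silhouette scores are treated as a continuous predictor of obesity defined as BMI >= 30. The sketch below is a self-contained illustration with invented data, not the study's analysis code.

    def auc_mann_whitney(scores, labels):
        """Area under the ROC curve via the Mann-Whitney identity.

        scores: predictor values (e.g. silhouette number 1-9); labels: 1 = obese, 0 = not obese.
        AUC = P(score of a random obese subject > score of a random non-obese subject),
        with ties counted as 1/2.
        """
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
        return wins / (len(pos) * len(neg))

    # Invented example: silhouette chosen by each respondent, obesity status from measured BMI.
    silhouettes = [3, 4, 7, 8, 5, 6, 2, 7, 9, 4]
    obese = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
    print(auc_mann_whitney(silhouettes, obese))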

Relevance: 30.00%

Abstract:

User retention is a major goal for higher education institutions running their teaching and learning programmes online. This is the first investigation into how the senses of presence and flow, together with perceptions of two central elements of the virtual education environment (didactic resource quality and instructor attitude), facilitate the user's intention to continue e-learning. We use data collected from a large sample survey of current users in a pure e-learning environment, along with objective data about their performance. The results provide support for the theoretical model. The paper further offers practical suggestions for institutions and instructors who aim to provide effective e-learning experiences.

Relevance: 30.00%

Abstract:

Data transmission between an electric motor and a frequency converter is required in variable-speed electric drives because of the sensors installed at the motor. Sensor information can be used in various applications to improve the reliability and properties of the system. Traditionally, the communication medium is implemented with additional cabling. However, the cost of the traditional method may be an obstacle to the wider application of data transmission between a motor and a frequency converter. In any case, a power cable is always installed between a motor and a frequency converter for the power supply, and hence it may be applied as a communication medium for sensor-level data. This thesis considers power line communication (PLC) in inverter-fed motor power cables. The motor cable is studied as a communication channel in the frequency band of 100 kHz–30 MHz. The communication channel and noise characteristics are described. All the individual components included in a variable-speed electric drive are presented in detail. A channel model is developed, and it is verified by measurements. A theoretical channel information capacity analysis is carried out to estimate the potential of the communication medium. Suitable communication and forward error correction (FEC) methods are suggested. A general method to implement a broadband, Ethernet-based communication medium between a motor and a frequency converter is proposed. A coupling interface is also developed that allows the communication device to be installed safely on a three-phase inverter-fed motor power cable. Practical tests are carried out, and the results are analyzed. Possible applications of the proposed method are presented. A speed-feedback motor control application is verified in detail by simulations and laboratory tests because of the restrictions that PLC imposes on the delay in the feedback loop. Other possible applications are discussed at a more general level.
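
The theoretical capacity analysis mentioned above can be illustrated with the Shannon capacity of a frequency-selective channel, summing B_k * log2(1 + SNR_k) over narrow sub-bands of the 100 kHz to 30 MHz band. The flat channel gain, noise floor and transmit PSD below are invented placeholders, not measurements from the thesis.

    import math

    def channel_capacity_bps(freqs_hz, gain_db, noise_dbm_per_hz, tx_psd_dbm_per_hz):
        """Approximate Shannon capacity (bit/s) of a frequency-selective channel.

        The band is split into sub-bands between consecutive frequency points; in each
        sub-band the SNR follows from the transmit PSD, channel gain and noise PSD,
        and the sub-band capacities B_k * log2(1 + SNR_k) are summed.
        """
        capacity = 0.0
        for k in range(len(freqs_hz) - 1):
            bandwidth = freqs_hz[k + 1] - freqs_hz[k]
            snr_db = tx_psd_dbm_per_hz + gain_db[k] - noise_dbm_per_hz[k]
            capacity += bandwidth * math.log2(1 + 10 ** (snr_db / 10))
        return capacity

    # Invented example: 30 sub-bands of 1 MHz, flat -40 dB channel gain,
    # -120 dBm/Hz noise floor and -80 dBm/Hz transmit PSD.
    freqs = [100e3 + i * 1e6 for i in range(31)]
    capacity = channel_capacity_bps(freqs, [-40.0] * 30, [-120.0] * 30, tx_psd_dbm_per_hz=-80.0)
    print(f"{capacity / 1e6:.1f} Mbit/s")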

Relevance: 30.00%

Abstract:

There is increasing interest in the use of breath analysis for monitoring human physiology and exposure to toxic substances or environmental pollutants. This review focuses on the current status of the sampling procedures, collection devices and sample-enrichment methodologies used for exhaled breath-vapor analysis. We discuss the parameters affecting each of the above steps, taking into account the requirements for breath analysis in exposure assessments and the need to analyze target compounds at sub-ppbv levels. Finally, we summarize the practical applications of exposure analysis over the past two decades.

Relevance: 30.00%

Abstract:

Mimicry is a central plank of emotional contagion theory; however, it has only been tested with facial and postural emotional stimuli. This study explores the existence of mimicry in voice-to-voice communication by analyzing 8,747 sequences of emotional displays between customers and employees in a call-center context. We listened live to 967 telephone interactions, registered the sequences of emotional displays, and analyzed them with a Markov chain. We also explored other propositions of emotional contagion theory that had yet to be tested in vocal contexts. The results support the proposition that mimicry is significantly present at all levels. Our findings fill an important gap in emotional contagion theory, have practical implications for voice-to-voice interactions, and open doors for future vocal mimicry research.
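
The Markov-chain treatment of display sequences can be sketched as follows: transitions from a customer display to the employee display that follows it are tallied into a row-normalised matrix, and mimicry shows up as probability mass concentrated on the diagonal. The three emotion categories and the toy sequences are illustrative assumptions, not the coding scheme of the study.

    from collections import Counter, defaultdict

    EMOTIONS = ["positive", "neutral", "negative"]

    def transition_matrix(pairs):
        """Row-normalised matrix P[c][e] = P(employee displays e | customer displayed c).

        pairs: observed (customer_display, employee_display) sequences.
        """
        counts = defaultdict(Counter)
        for customer, employee in pairs:
            counts[customer][employee] += 1
        matrix = {}
        for c in EMOTIONS:
            total = sum(counts[c].values()) or 1  # avoid division by zero for unseen rows
            matrix[c] = {e: counts[c][e] / total for e in EMOTIONS}
        return matrix

    # Toy data standing in for coded call-center interactions.
    observed = [("negative", "negative"), ("negative", "neutral"), ("positive", "positive"),
                ("neutral", "neutral"), ("positive", "positive"), ("negative", "negative")]
    for customer_emotion, row in transition_matrix(observed).items():
        print(customer_emotion, row)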

Relevance: 30.00%

Abstract:

The uncertainty of any analytical determination depends on both analysis and sampling. Uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy's sampling theory is currently the most complete theory of sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work Gy's sampling theory was applied to several cases, including the analysis of chromite concentration estimated from SEM (scanning electron microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy's sampling theory can be utilized in both of the above-mentioned cases and that the uncertainties achieved are reliable. Variographic experiments, introduced in Gy's sampling theory, are usefully applied in analyzing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of these kinds of processes can be observed by variographic analysis, as well as with the fast Fourier transform and auto-correlation functions. With variographic analysis, the uncertainties are estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, since it is then easy to estimate how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources will be used; on the other hand, if the frequency is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as principal component analysis (PCA) are needed when the data are analyzed. Optimization of a sampling plan increases the reliability of the analytical process, which may in the end have beneficial effects on the economics of chemical analysis.
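
The variographic estimation of uncertainty as a function of the sampling interval can be illustrated with the experimental variogram of a process time series, V(j) = sum_i (x[i+j] - x[i])^2 / (2(N - j)) for lag j expressed in sampling intervals. The sketch below uses synthetic data and is only an illustration of the calculation, not the software used in this work.

    import math

    def variogram(series, max_lag):
        """Experimental variogram V(j) for lags j = 1..max_lag.

        Plotted against the lag (i.e. the sampling interval), V(j) shows how fast the
        sampling uncertainty grows as samples are taken further apart and reveals
        periodic behaviour of the process.
        """
        n = len(series)
        return {j: sum((series[i + j] - series[i]) ** 2 for i in range(n - j)) / (2 * (n - j))
                for j in range(1, max_lag + 1)}

    # Synthetic auto-correlated process: a slow daily cycle plus a fast fluctuation.
    x = [10 + 2 * math.sin(2 * math.pi * i / 24) + 0.3 * math.sin(17 * i) for i in range(200)]
    for lag, v in variogram(x, max_lag=12).items():
        print(f"lag {lag:2d}: V = {v:.3f}")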

Relevance: 30.00%

Abstract:

In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and whose results can be evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of means and a paired t-test.
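
The idea of personalised, automatically evaluated exercises can be sketched as follows: a calibration data set is generated randomly within instructor-defined constraints (seeded on the student identifier so that it is reproducible), and the student's reported regression parameters are checked against values recomputed from the same data. The sketch is written in Python purely for illustration, and the names, constraints and tolerances are assumptions; the actual Goodle GMS exercises are defined by the instructor in Matlab.

    import random

    def generate_calibration_data(student_id, n_points=8):
        """Random but reproducible (concentration, signal) calibration data for one student."""
        rng = random.Random(student_id)        # seeding on the student id personalises the data
        slope = rng.uniform(0.8, 1.5)          # constraints fixed by the instructor
        intercept = rng.uniform(0.0, 0.2)
        return [(c, slope * c + intercept + rng.gauss(0, 0.02)) for c in range(1, n_points + 1)]

    def grade_regression(data, reported_slope, reported_intercept, tol=0.01):
        """Recompute the least-squares fit from the same data and compare with the student's answer."""
        x = [c for c, _ in data]
        y = [s for _, s in data]
        mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
        slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
                 / sum((xi - mean_x) ** 2 for xi in x))
        intercept = mean_y - slope * mean_x
        return abs(slope - reported_slope) <= tol and abs(intercept - reported_intercept) <= tol

    data = generate_calibration_data(student_id=42)
    print(data)
    print(grade_regression(data, reported_slope=1.2, reported_intercept=0.1))  # a student's answer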

Relevance: 30.00%

Abstract:

This study presents an automatic, computer-aided analytical method called Comparison Structure Analysis (CSA), which can be applied to different dimensions of music. The aim of CSA is first and foremost practical: to produce dynamic and understandable representations of musical properties by evaluating the prevalence of a chosen musical data structure throughout a musical piece. Such a comparison structure may refer to a mathematical vector, a set, a matrix or another type of data structure, or even a combination of data structures. CSA depends on an abstract systematic segmentation that allows for a statistical or mathematical survey of the data. To choose a comparison structure is to tune the apparatus to be sensitive to an exclusive set of musical properties. CSA sits somewhere between traditional music analysis and computer-aided music information retrieval (MIR). Theoretically defined musical entities, such as pitch-class sets, set-classes and particular rhythm patterns, are detected in compositions using the pattern extraction and pattern comparison algorithms typical of the field of MIR. In principle, the idea of comparison structure analysis can be applied to any time-series type of data and, in the music-analytical context, to polyphonic as well as homophonic music. Tonal trends, set-class similarities, invertible counterpoints, voice-leading similarities, short-term modulations, rhythmic similarities and multiparametric changes in musical texture were studied. Since CSA allows for a highly accurate classification of compositions, its methods may be applicable to symbolic music information retrieval as well. The strength of CSA lies especially in the possibility of making comparisons between observations concerning different musical parameters and of combining it with statistical and perhaps other music-analytical methods. The results of CSA depend on the adequacy of the similarity measure. New similarity measures for tonal stability, rhythmic similarity and set-class similarity were proposed. The most advanced results were attained by employing automated function generation, comparable with so-called genetic programming, to search for an optimal model for set-class similarity measurements. However, the results of CSA seem to agree strongly with each other, independent of the type of similarity function employed in the analysis.
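
The core operation of evaluating the prevalence of a chosen comparison structure over a systematic segmentation can be sketched very simply: the piece is cut into fixed-length windows and, for each window, the proportion of notes whose pitch class belongs to a chosen pitch-class set is reported. This is only a schematic illustration of the general idea with invented data; CSA itself uses far richer structures and similarity measures.

    def prevalence_curve(pitches, target_pc_set, window=8):
        """Proportion of notes per consecutive window whose pitch class lies in target_pc_set.

        pitches: MIDI note numbers in temporal order; target_pc_set: pitch classes 0-11.
        The resulting curve is one elementary example of a comparison structure
        evaluated over a systematic segmentation of a piece.
        """
        curve = []
        for start in range(0, len(pitches) - window + 1, window):
            segment = pitches[start:start + window]
            hits = sum(1 for p in segment if p % 12 in target_pc_set)
            curve.append(hits / window)
        return curve

    # Invented melody; {0, 4, 7} is the C major triad as a pitch-class set.
    melody = [60, 64, 67, 72, 62, 65, 69, 71, 60, 67, 64, 60, 61, 66, 68, 70]
    print(prevalence_curve(melody, target_pc_set={0, 4, 7}, window=8))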

Relevance: 30.00%

Abstract:

Cutin and suberin are structural and protective polymers of plant surfaces. The epidermal cells of the aerial parts of plants are covered with an extracellular cuticular layer, which consists of the polyester cutin, highly resistant cutan, cuticular waxes and the polysaccharides that link the layer to the epidermal cells. A similar protective layer is formed by suberin, a polyaromatic-polyaliphatic biopolymer, which is present particularly in the cell walls of the phellem layer of the periderm of the underground parts of plants (e.g. roots and tubers) and in the bark of trees. In addition, suberization is a major factor in wound healing and wound periderm formation regardless of the plant tissue. Knowledge of the composition and functions of cuticular and suberin polymers is important for understanding the physiological properties of the plants and the nutritional quality when these plants are consumed as foods. The aims of the practical work were to assess the chemical composition of the cuticular polymers of several northern berries and seeds and of the suberin of two potato varieties. Cutin and suberin were studied as isolated polymers and, after depolymerization, as soluble monomers and solid residues. Chemical and enzymatic depolymerization techniques were compared, and a new chemical depolymerization method was developed. Gas chromatographic analysis with mass spectrometric detection (GC-MS) was used to assess the monomer compositions. Polymer investigations were conducted with solid-state carbon-13 cross-polarization magic-angle-spinning nuclear magnetic resonance spectroscopy (13C CP-MAS NMR), Fourier transform infrared spectroscopy (FTIR) and microscopic analysis. Furthermore, the development of suberin over one year of post-harvest storage was investigated, and the cuticular layers of berries grown in the north and south of Finland were compared. The results show that the amounts of isolated cuticular layers and cutin monomers, as well as the monomeric compositions, vary greatly between the berries. The monomer composition of the seeds was found to differ from that of the corresponding berry peels. The berry cutin monomers were composed mostly of long-chain aliphatic ω-hydroxy acids with various mid-chain functionalities (double bonds, epoxy, hydroxy and keto groups). Substituted α,ω-diacids predominated over ω-hydroxy acids in the potato suberin monomers, and slight differences were found between the varieties. The newly developed closed-tube chemical method was found to be suitable for cutin and suberin analysis and preferable to the solvent-consuming and laborious reflux method. Enzymatic hydrolysis with cutinase was less effective than chemical methanolysis and showed specificity towards α,ω-diacid bonds. According to 13C CP-MAS NMR and FTIR, the depolymerization residues contained significant amounts of aromatic structures, polysaccharides and possibly cutan-type aliphatic moieties. Cultivation location seems to have an effect on cuticular composition. The materials studied contained significant amounts of different types of biopolymers that could be utilized for several purposes with or without further processing. The importance of the so-called waste material from the industrial processing of berries and potatoes as a source of either dietary fiber or specialty chemicals should be investigated further in detail. The evident impact of cuticular and suberin polymers, among other fiber components, on human health should be investigated in clinical trials. These by-product materials may be used as value-added fiber fractions in the food industry and as raw materials for specialty chemicals such as lubricants and emulsifiers, or as building blocks for novel polymers.