944 results for Objects
Abstract:
The paper spells out five different accounts of the relationship between objects and relations, three of which are versions of ontic structural realism (OSR). We argue that the distinction between objects and properties, including relations, is merely a conceptual one rather than an ontological one: properties, including relations, are modes, that is, the concrete, particular ways in which objects exist. We then set out moderate OSR as the view according to which irreducible relations are central ways in which the fundamental physical objects exist. Physical structures thus consist in objects for which it is essential that they are related in certain ways. Hence there are objects, but they do not possess an intrinsic identity. This view can also admit intrinsic properties as ways in which objects exist, provided that these do not amount to identity conditions for the objects. Finally, we indicate how this view can take objective modality into account.
Abstract:
The purpose of this study was to assess the knowledge and attitudes of health care professionals regarding their use of universal precaution measures at a public emergency service. The study also aimed to assess the rates of occupational accidents involving biological substances among those workers. This study was performed with 238 workers, from June to November 2006, using univariate and multivariate analysis. The odds of not adopting precaution measures were 20.7 (95% CI: 5.68-75.14) times greater among drivers than among physicians. No significant association was found between knowledge and the adoption of universal precaution measures. The occupational accident rate was 20.6% (40.8% involving sharp-edged objects). The risk of physicians having an occupational accident was 2.7 (95% CI: 1.05-7.09) times higher than that of drivers. The fact that a staff member had adequate knowledge about universal precaution measures was insufficient to foster attitudes compatible with reducing the risk of transmitting infectious agents and causing occupational accidents.
Abstract:
The project aims at advancing the state of the art in the use of context information for the classification of image and video data. The use of context in image classification has been shown to be of great importance for improving the performance of current object recognition systems. In this project we proposed the concept of Multi-scale Feature Labels as a general and compact method to exploit local and global context. The extraction of features from the discriminative probability or classification-confidence label field is highly novel. Moreover, the use of a multi-scale representation of the feature labels leads to a compact and efficient description of the context. A further goal of the project was to provide a general-purpose method and to prove its suitability in different image/video analysis problems. The two-year project generated 5 journal publications (plus 2 under submission), 10 conference publications (plus 2 under submission) and one patent (plus 1 pending). A substantial number of these publications make use of the main result of this project to improve results in the detection and/or segmentation of objects.
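The core idea lends itself to a compact illustration. Below is a minimal sketch (not the project's actual code; the function name and the pooling scheme are my assumptions) of how multi-scale feature labels could be built from a per-pixel classification-confidence map: the class probabilities are average-pooled over neighborhoods of increasing size and concatenated, giving each pixel a local-to-global context descriptor.

```python
# Minimal sketch of the "multi-scale feature labels" idea, assuming a
# per-pixel class-probability map from a base classifier is available.
# All names are hypothetical; this is not the project's actual code.
import numpy as np

def multiscale_feature_labels(prob_map, scales=(1, 2, 4, 8)):
    """prob_map: (H, W, C) class probabilities. Returns an
    (H, W, C * len(scales)) array: per pixel, the class probabilities
    average-pooled over neighborhoods of increasing size."""
    H, W, C = prob_map.shape
    features = []
    for s in scales:
        # Average-pool with a (2s+1) x (2s+1) box filter via shifted sums.
        padded = np.pad(prob_map, ((s, s), (s, s), (0, 0)), mode="edge")
        pooled = np.zeros_like(prob_map)
        for dy in range(-s, s + 1):
            for dx in range(-s, s + 1):
                pooled += padded[s + dy:s + dy + H, s + dx:s + dx + W]
        features.append(pooled / (2 * s + 1) ** 2)
    return np.concatenate(features, axis=-1)

# Usage: feed the stacked features to a second-stage per-pixel classifier.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(3), size=(32, 32))  # fake (32, 32, 3) label field
context = multiscale_feature_labels(probs)
print(context.shape)  # (32, 32, 12)
```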
Abstract:
Background: To enhance our understanding of complex biological systems like diseases, we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated, COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects. The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
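As a rough illustration of the kind of sub-network retrieval the abstract describes, here is a minimal sketch using networkx on a toy typed graph. This is not BioXM's API; the node names, relation labels and the helper function are all hypothetical.

```python
# Minimal sketch of typed sub-network retrieval from a knowledge graph.
# Hypothetical toy example; not BioXM's actual data model or query API.
import networkx as nx

G = nx.Graph()
# Heterogeneous knowledge graph: nodes of mixed types, typed relations.
G.add_edge("SERPINA1", "ELANE", relation="protein-protein interaction")
G.add_edge("SERPINA1", "COPD", relation="gene-disease")
G.add_edge("COPD", "roflumilast", relation="disease-compound")
G.add_edge("ELANE", "neutrophil degranulation", relation="pathway")

def subnetwork(graph, seed, max_dist=2, relations=None):
    """Return the neighborhood of `seed` up to `max_dist` hops, optionally
    keeping only edges whose relation type is in `relations`."""
    nodes = nx.single_source_shortest_path_length(graph, seed, cutoff=max_dist)
    sub = graph.subgraph(nodes).copy()
    if relations is not None:
        drop = [(u, v) for u, v, d in sub.edges(data=True)
                if d["relation"] not in relations]
        sub.remove_edges_from(drop)
    return sub

copd_net = subnetwork(G, "COPD", max_dist=2)
print(sorted(copd_net.nodes))
```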
Abstract:
Introduction: Responses to external stimuli are typically investigated by averaging peri-stimulus electroencephalography (EEG) epochs in order to derive event-related potentials (ERPs) across the electrode montage, under the assumption that signals related to the external stimulus are fixed in time across trials. We demonstrate the applicability of a single-trial model based on patterns of scalp topographies (De Lucia et al, 2007) that can be used for ERP analysis at the single-subject level. The model is able to classify new trials (or groups of trials) with minimal a priori hypotheses, using information derived from a training dataset. The features used for the classification (the topography of responses and their latency) can be interpreted neurophysiologically, because a difference in scalp topography indicates a different configuration of brain generators. Above-chance classification accuracy on test datasets implicitly demonstrates the suitability of this model for EEG data. Methods: The data analyzed in this study were acquired from two separate visual evoked potential (VEP) experiments. The first entailed passive presentation of checkerboard stimuli to each of the four visual quadrants (hereafter, "Checkerboard Experiment") (Plomp et al, submitted). The second entailed active discrimination of novel versus repeated line drawings of common objects (hereafter, "Priming Experiment") (Murray et al, 2004). Four subjects per experiment were analyzed, using approx. 200 trials per experimental condition. These trials were randomly separated into training (90%) and testing (10%) datasets in 10 independent shuffles. In order to perform the ERP analysis we estimated the statistical distribution of voltage topographies with a Mixture of Gaussians (MofGs), which reduces our original dataset to a small number of representative voltage topographies. We then evaluated statistically the degree of presence of these template maps across trials, and whether and when this differed across experimental conditions. Based on these differences, single trials or sets of a few single trials were classified as belonging to one or the other experimental condition. Classification performance was assessed using the area under the Receiver Operating Characteristic (ROC) curve. Results: For the Checkerboard Experiment, contrasts entailed left vs. right visual field presentations for upper and lower quadrants, separately. The average posterior probabilities, indicating the presence of the computed template maps in time and across trials, revealed significant differences starting at ~60-70 ms post-stimulus. The average ROC curve area across all four subjects was 0.80 and 0.85 for upper and lower quadrants, respectively, and was in all cases significantly higher than chance (unpaired t-test, p<0.0001). In the Priming Experiment, we contrasted initial versus repeated presentations of visual object stimuli. Their posterior probabilities revealed significant differences starting at 250 ms post-stimulus onset. The classification accuracy rates with single-trial test data were at chance level. We therefore considered sub-averages based on five single trials. We found that for three out of four subjects, classification rates were significantly above chance level (unpaired t-test, p<0.0001). Conclusions: The main advantage of the present approach is that it is based on topographic features that are readily interpretable along neurophysiological lines.
As these maps were previously normalized by the overall strength of the field potential on the scalp, a change in their presence across trials and between conditions necessarily reflects a change in the underlying generator configurations. The temporal periods of statistical difference between conditions were estimated for each training dataset for ten shuffles of the data. Across the ten shuffles and in both experiments, we observed a high level of consistency in the temporal periods over which the two conditions differed. With this method we are able to analyze ERPs at the single-subject level, providing a novel tool for comparing normal electrophysiological responses with single cases that cannot be considered part of any cohort of subjects. This aspect promises to have a strong impact on both basic and clinical research.
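To make the pipeline concrete, here is a minimal sketch of the analysis style described above, under stated assumptions: topographies are normalized by their field strength, clustered with scikit-learn's GaussianMixture, and the time-averaged template posteriors serve as trial features scored by ROC area. The synthetic data, parameter values and classifier choice are placeholders, not those of the study.

```python
# Sketch: cluster normalized voltage topographies with a Mixture of
# Gaussians, use template posteriors as single-trial features, score by
# ROC area. Synthetic data and all parameters are placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_trials, n_times, n_electrodes, n_templates = 100, 50, 32, 4

# Fake data: (trials, time points, electrodes); alternating condition labels.
X = rng.normal(size=(n_trials, n_times, n_electrodes))
y = np.tile([0, 1], n_trials // 2)
X[y == 1, 20:30] += 0.4  # condition-specific topography, "post-stimulus" window

# Normalize each topography by its global field power, then fit the MofGs
# on all training time points pooled across trials.
X /= np.linalg.norm(X, axis=-1, keepdims=True)
train, test = np.arange(90), np.arange(90, 100)
gmm = GaussianMixture(n_components=n_templates, random_state=0)
gmm.fit(X[train].reshape(-1, n_electrodes))

def trial_features(trials):
    # Posterior probability of each template map, averaged over time.
    post = gmm.predict_proba(trials.reshape(-1, n_electrodes))
    return post.reshape(len(trials), n_times, n_templates).mean(axis=1)

clf = LogisticRegression().fit(trial_features(X[train]), y[train])
scores = clf.predict_proba(trial_features(X[test]))[:, 1]
print("ROC AUC:", roc_auc_score(y[test], scores))
```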
Abstract:
BACKGROUND: Visual behavior is known to be atypical in Autism Spectrum Disorders (ASD). Monitor-based eye-tracking studies have measured several of these atypicalities in individuals with Autism. While atypical behaviors are known to be accentuated during natural interactions, few studies have examined gaze behavior in natural interactions. In this study we focused on i) whether findings obtained in laboratory settings are also visible in a naturalistic interaction; ii) whether new atypical elements appear when studying visual behavior across the whole field of view. METHODOLOGY/PRINCIPAL FINDINGS: Ten children with ASD and ten typically developing children participated in a dyadic interaction with an experimenter administering items from the Early Social Communication Scale (ESCS). The children wore a novel head-mounted eye-tracker, measuring gaze direction and the presence of faces across the child's field of view. The analysis of gaze episodes to faces revealed that children with ASD looked at the experimenter significantly less often and for shorter periods of time. The analysis of gaze patterns across the child's field of view revealed that children with ASD looked downwards and made more extensive use of their lateral field of view when exploring the environment. CONCLUSIONS/SIGNIFICANCE: The data gathered in naturalistic settings confirm findings previously obtained only in monitor-based studies. Moreover, the study allowed us to observe a generalized strategy of lateral gaze in children with ASD when they were looking at the objects in their environment.
Abstract:
Particle physics studies highly complex processes which cannot be directly observed. Scientific realism claims that we are nevertheless warranted in believing that these processes really occur and that the objects involved in them really exist. This dissertation defends a version of scientific realism, called causal realism, in the context of particle physics. I start by introducing the central theses and arguments in the recent philosophical debate on scientific realism (chapter 1), with a special focus on an important presupposition of the debate, namely common sense realism. Chapter 2 then discusses entity realism, which introduces a crucial element into the debate by emphasizing the importance of experiments in defending scientific realism. Most of the chapter is concerned with Ian Hacking's position, but I also argue that Nancy Cartwright's version of entity realism is ultimately preferable as a basis for further development. In chapter 3, I take a step back and consider the question whether the realism debate is worth pursuing at all. Arthur Fine has given a negative answer to that question, proposing his natural ontological attitude as an alternative to both realism and antirealism. I argue that the debate (in particular the realist side of it) is in fact less vicious than Fine presents it. The second part of my work (chapters 4-6) develops, illustrates and defends causal realism. The key idea is that inference to the best explanation is reliable in some cases, but not in others. Chapter 4 characterizes the difference between these two kinds of cases in terms of three criteria which distinguish causal from theoretical warrant. In order to flesh out this distinction, chapter 5 then applies it to a concrete case from the history of particle physics, the discovery of the neutrino. This case study shows that the distinction between causal and theoretical warrant is crucial for understanding what it means to "directly detect" a new particle. But the distinction is also an effective tool against what I take to be the presently most powerful objection to scientific realism: Kyle Stanford's argument from unconceived alternatives. I respond to this argument in chapter 6, and I illustrate my response with a discussion of Jean Perrin's experimental work concerning the atomic hypothesis. In the final part of the dissertation, I turn to the specific challenges posed to realism by quantum theories. One of these challenges comes from the experimental violations of Bell's inequalities, which indicate a failure of locality in the quantum domain. I show in chapter 7 how causal realism can further our understanding of quantum non-locality by taking account of some recent experimental results. Another challenge to realism in quantum mechanics comes from delayed-choice experiments, which seem to imply that certain aspects of what happens in an experiment can be influenced by later choices of the experimenter. Chapter 8 analyzes these experiments and argues that they do not warrant the antirealist conclusions which some commentators draw from them. It pays particular attention to the case of delayed-choice entanglement swapping and the corresponding question whether entanglement is a real physical relation. In chapter 9, I finally address relativistic quantum theories. It is often claimed that these theories are incompatible with a particle ontology, and this calls into question causal realism's commitment to localizable and countable entities.
I defend the commitments of causal realism against these objections, and I conclude with some remarks connecting the interpretation of quantum field theory to more general metaphysical issues confronting causal realism.
Abstract:
In a series of three experiments, participants made inferences about which of a pair of objects scored higher on a criterion. The first experiment was designed to contrast the prediction of Probabilistic Mental Model theory (Gigerenzer, Hoffrage, & Kleinbölting, 1991) concerning sampling procedure with the hard-easy effect. The experiment failed to support the theory's prediction that a particular pair of randomly sampled item sets would differ in percentage correct; but the observation that German participants performed practically as well on comparisons between U.S. cities (many of which they did not even recognize) as on comparisons between German cities (about which they knew much more) ultimately led to the formulation of the recognition heuristic. Experiment 2 was a second, this time successful, attempt to unconfound item difficulty and sampling procedure. In Experiment 3, participants' knowledge and recognition of each city was elicited, and how often this could be used to make an inference was manipulated. Choices were consistent with the recognition heuristic in about 80% of the cases when it discriminated and people had no additional knowledge about the recognized city (and in about 90% when they had such knowledge). The frequency with which the heuristic could be used affected the percentage correct, mean confidence, and overconfidence as predicted. The size of the reference class, which was also manipulated, modified these effects in meaningful and theoretically important ways.
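The decision rule at the heart of the recognition heuristic is simple enough to state in a few lines of code. The sketch below is an illustration, not the authors' implementation; the fallback behavior for cases where recognition does not discriminate is my assumption.

```python
# Sketch of the recognition heuristic's decision rule: if exactly one of
# two objects is recognized, infer that it scores higher on the criterion.
# The fallbacks below (knowledge lookup, then guessing) are assumptions.
import random

def recognition_heuristic(a, b, recognized, knowledge=None):
    """Pick the object inferred to score higher (e.g. city population)."""
    ra, rb = a in recognized, b in recognized
    if ra != rb:                      # the heuristic discriminates
        return a if ra else b
    if knowledge and (a in knowledge or b in knowledge):
        return max(a, b, key=lambda x: knowledge.get(x, 0))
    return random.choice([a, b])      # neither or both recognized, no cue

recognized = {"Munich", "Hamburg"}
print(recognition_heuristic("Munich", "Bielefeld", recognized))  # -> Munich
```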
Abstract:
Why do people coordinate on the use of valueless pieces of paper as generally accepted money? A possible answer is that these objects have intrinsic properties that make them better candidates to be used as media of exchange. Another answer stresses the fact that inconvertible fiat money will not easily appear unless there is a centralized institution that favors its use. The main objective of this paper is to analyze these questions. To do so, we take a model of commodity money in which fiat money does not play any significant role and modify it to examine under which circumstances fiat money might come to circulate as a medium of exchange. Some of the results obtained from the model differ in a rather substantial way from the previous related literature.
Abstract:
ABSTRACT This thesis is composed of two main parts. The first addresses the question of whether the auditory and somatosensory systems, like their visual counterpart, comprise parallel functional pathways for processing identity and spatial attributes (so-called 'what' and 'where' pathways, respectively). The second part examined the independence of control processes mediating task switching across 'what' and 'where' pathways in the auditory and visual modalities. Concerning the first part, electrical neuroimaging of event-related potentials identified the spatio-temporal mechanisms subserving auditory (see Appendix, Study n°1) and vibrotactile (see Appendix, Study n°2) processing during two types of blocks of trials. 'What' blocks varied stimuli in their frequency independently of their location. 'Where' blocks varied the same stimuli in their location independently of their frequency. Concerning the second part (see Appendix, Study n°3), a psychophysical task-switching paradigm was used to investigate the hypothesis that the efficacy of control processes depends on the extent of overlap between the neural circuitry mediating the different tasks at hand, such that more effective task preparation (and by extension smaller switch costs) is achieved when the anatomical/functional overlap of this circuitry is small. Performance costs associated with switching tasks and/or switching sensory modalities were measured. Tasks required the analysis of either the identity or the spatial location of environmental objects ('what' and 'where' tasks, respectively) that were presented either visually or acoustically on any given trial. Pretrial cues informed participants of the upcoming task, but not of the sensory modality. In the audio-visual domain, the results showed that switch costs between tasks were significantly smaller when the sensory modality of the task switched versus when it repeated. In addition, switch costs between the senses were correlated only when the sensory modality of the task repeated across trials and not when it switched. The collective evidence not only supports the independence of control processes mediating task switching and modality switching, but also the hypothesis that switch costs reflect competitive interference between neural circuits that can in turn be diminished when these neural circuits are distinct. In the auditory and somatosensory domains, the findings show that a segregation of location vs. recognition information is observed across sensory systems and that this happens around 100 ms for both sensory modalities. Also, our results show that functionally specialized pathways for audition and somatosensation involve largely overlapping brain regions, i.e. posterior superior and middle temporal cortices and inferior parietal areas. Both these properties (synchrony of differential processing and overlapping brain regions) probably optimize the relationships across sensory modalities. These results may therefore be indicative of a computationally advantageous organization for processing spatial and identity information.
Abstract:
Management control today is oriented towards acting before undesirable events occur, ensuring that the objectives established by management are achieved within the fixed timing. Beyond that, management control should be the engine that drives the best performance in the company's critical areas, not only in the economic and financial domains but also in growth, security and productivity. One of the most important tasks of today's administrations is to determine whether the organization's performance is in line with what was previously established, that is, with its objectives and targets. The means of verifying this performance would be the use of effective methods and performance measurement systems. In this context, the present work is an exploratory, descriptive study that identifies and investigates how the banking institutions of Cape Verde manage certain aspects, particularly performance measurement and strategic control, and which indicators they use. Although the specific objectives of the study lie elsewhere, we also give special attention to the characteristics of the Cape Verdean market and to the importance of the banking sector for the country's economy. Finally, we present the Balanced Scorecard as a tool capable of overcoming the difficulties of performance measurement, together with the set of indicators we consider most adequate. Here we concentrate on the four basic perspectives and the strategy map, noting the role of the Balanced Scorecard in strategic alignment and in the measurement of organizational performance. We conclude the study with an interview with an expert (a Financial Director) of one of the local banks, whose name we agreed not to publish. In this way, we hope to contribute to a better perception of the reality under study, both from a theoretical point of view and through the verification of practice in the sector.
Abstract:
In the homogeneous case of one type of goods or objects, we prove the existence of an additive utility function without assuming transitivity of indifference and independence. The representation reveals a positive factor smaller than 1 that influences rational choice beyond the utility function and explains departures from these standard axioms of utility theory (which hold when the factor equals 1).
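For illustration only, here is one way a positive factor smaller than 1 can enter such a representation and produce non-transitive indifference; the paper's exact axioms and representation may differ, and the form below is an assumption.

```latex
% Illustrative only: one way a factor 0 < \lambda \le 1 can enter an
% additive representation and relax transitivity of indifference,
% assuming u > 0. The paper's exact representation may differ.
\[
  x \succ y \iff u(x) > \tfrac{1}{\lambda}\, u(y), \qquad 0 < \lambda \le 1 .
\]
% For \lambda = 1 this reduces to the standard representation
% x \succ y \iff u(x) > u(y). For \lambda < 1, indifference holds whenever
% \lambda \le u(x)/u(y) \le 1/\lambda, which is not transitive.
```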
Abstract:
Hierarchical clustering is a popular method for finding structure in multivariate data, resulting in a binary tree constructed on the particular objects of the study, usually sampling units. The user faces the decision of where to cut the binary tree in order to determine the number of clusters to interpret, and there are various ad hoc rules for arriving at a decision. A simple permutation test is presented that diagnoses whether non-random levels of clustering are present in the set of objects and, if so, indicates the specific level at which the tree can be cut. The test is validated against random matrices to verify the type I error probability, and a power study is performed on data sets with known clusteredness to study the type II error.
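A minimal sketch of this kind of test is given below, assuming Ward linkage and using the largest gap between successive merge heights as the clusteredness statistic; the paper's actual statistic and cut rule may differ, and the data, function names and parameters are placeholders.

```python
# Sketch of a permutation test for clusteredness: compare an observed
# statistic against data with columns independently permuted (destroying
# multivariate cluster structure). Statistic and cut rule are simplified
# placeholders, not the paper's exact procedure.
import numpy as np
from scipy.cluster.hierarchy import linkage

def max_merge_gap(X):
    """Largest gap between successive merge heights in a Ward tree;
    a big gap suggests a natural level at which to cut."""
    heights = linkage(X, method="ward")[:, 2]
    gaps = np.diff(heights)
    return gaps.max(), int(gaps.argmax())

def cluster_permutation_test(X, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    stat, level = max_merge_gap(X)
    null = np.empty(n_perm)
    for i in range(n_perm):
        Xp = np.column_stack([rng.permutation(col) for col in X.T])
        null[i] = max_merge_gap(Xp)[0]
    p = (1 + np.sum(null >= stat)) / (1 + n_perm)
    return p, level

# Toy data: two well-separated groups of sampling units.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(4, 1, (20, 5))])
p, level = cluster_permutation_test(X)
print(f"p = {p:.3f}, largest gap after merge {level + 1}")
```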
Abstract:
Models of the exchange process based on search theory can be used to analyze the features of objects that make them more or less likely to emerge as "money" in equilibrium. These models illustrate the trade-off between endogenous acceptability (an equilibrium property) and intrinsic characteristics of goods, such as storability, recognizability, etc. In this paper, we look at how the relative supply and demand for various goods affect their likelihood of becoming money. Intuitively, goods in high demand and/or low supply are more likely to appear as commodity money, subject to the qualification that which object ends up circulating as a medium of exchange depends at least partly on convention. Welfare properties are discussed.