902 results for Deterministic imputation


Relevance:

10.00%

Publisher:

Abstract:

In recent years, an explosion of interest in neuroscience has led to the development of "Neuro-law," a new multidisciplinary field of knowledge whose aim is to examine the impact and role of neuroscientific findings in legal proceedings. Neuroscientific evidence is increasingly being used in US and European courts in criminal trials, as part of psychiatric testimony, fueling the debate about the legal implications of brain research in psychiatric-legal settings. In these proceedings, the role of forensic psychiatrists is crucial. In most criminal justice systems, their mission consists of two basic tasks: assessing the degree of responsibility of the offender and evaluating their future dangerousness. In the first part of our research, we examine the impact of neuroscientific evidence on the assessment of criminal responsibility, a key concept of law. Initial jurisprudential research leads to the conclusion that there are significant difficulties and limitations in using neuroscience for the assessment of criminal responsibility. In the current socio-legal context, responsibility assessments are progressively being weakened, whereas dangerousness assessments are gaining importance in the field of forensic psychiatry. In the second part of our research we concentrate on the impact of using neuroscience for the assessment of dangerousness. We argue that in the current policy era of zero tolerance, judges, confronted with the pressure to ensure public security, may tend to interpret neuroscientific knowledge and data as an objective and reliable way of evaluating one's dangerousness and risk of reoffending, rather than one's responsibility. This tendency could be encouraged by a utilitarian approach to punishment, advanced by some recent neuroscientific research which calls into question the existence of free will and responsibility and argues for a rejection of the retributive theory of punishment. Although this shift away from punishment aimed at retribution, in favor of a consequentialist approach to criminal law, is advanced by some authors as more progressive and humane, we believe that it could lead to the instrumentalisation of neuroscience in the interest of public safety, which can run counter to the proper exercise of justice and the civil liberties of offenders. By advancing a criminal law regime animated by the consequentialist aim of avoiding social harms through rehabilitation, neuroscience promotes a return to a therapeutic approach to crime, which can have a serious impact on the kind and length of sentences imposed on offenders; if neuroscientific data are interpreted as evidence of dangerousness rather than responsibility, it is highly likely that judges will impose heavier sentences and/or security measures (in civil law systems), which can be indeterminate in length. The errors and epistemic traps of past criminological movements that tried to explain violent and deviant behavior on a biological and deterministic basis stress the need for caution concerning the use of modern neuroscientific methods in criminal proceedings.

Relevance:

10.00%

Publisher:

Abstract:

A collection of spherical obstacles in the unit ball in Euclidean space is said to be avoidable for Brownian motion if there is a positive probability that Brownian motion diffusing from some point in the ball will avoid all the obstacles and reach the boundary of the ball. The centres of the spherical obstacles are generated according to a Poisson point process while the radius of an obstacle is a deterministic function. If avoidable configurations are generated with positive probability, Lundh calls this percolation diffusion. An integral condition for percolation diffusion is derived in terms of the intensity of the point process and the function that determines the radii of the obstacles.
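
A minimal simulation sketch of this setup (our illustration, not taken from the paper): obstacles are drawn from a homogeneous Poisson process on the 2-D unit disk, the radius function `radius_fn` is a hypothetical choice, and Brownian motion is approximated by a discrete random walk. The Monte Carlo average estimates the avoidance probability whose positivity defines percolation diffusion.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_centres(intensity):
    """Homogeneous Poisson point process on the 2-D unit disk."""
    n = rng.poisson(intensity * np.pi)          # mean = intensity * area
    r = np.sqrt(rng.uniform(size=n))            # uniform over the disk
    th = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return np.column_stack([r * np.cos(th), r * np.sin(th)])

def radius_fn(centre):
    """Hypothetical deterministic radius function of the centre."""
    return 0.02 * (1.0 - np.linalg.norm(centre))

def escapes(centres, radii, dt=2e-4, max_steps=50_000):
    """True if the random walk (discretized Brownian motion) started at
    the origin reaches the unit circle before entering any obstacle."""
    x = np.zeros(2)
    for _ in range(max_steps):
        x = x + np.sqrt(dt) * rng.standard_normal(2)
        if x @ x >= 1.0:
            return True
        if len(centres) and np.any(np.linalg.norm(centres - x, axis=1) <= radii):
            return False
    return False

trials, hits = 50, 0
for _ in range(trials):
    centres = poisson_centres(intensity=50.0)
    radii = np.array([radius_fn(c) for c in centres])
    hits += escapes(centres, radii)
print(f"estimated avoidance probability: {hits / trials:.2f}")
```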

Relevance:

10.00%

Publisher:

Abstract:

Marketing research has studied the permanence of a client within an enterprise because it is a key element in estimating the (economic) customer lifetime value (CLV). This research is based on deterministic or stochastic models that allow the client's permanence, and hence the CLV, to be estimated. However, when these schemes cannot be applied because the panel data they require are unavailable, the length of a client's relationship with the enterprise becomes uncertain data. The value of the present work lies in offering an alternative way to estimate this period using subjective information, as treated in the theory of uncertainty.
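
As a toy illustration of why the lifetime matters (our sketch; the paper's uncertainty-theory model is not reproduced here), the CLV can be bracketed by evaluating a discounted-margin formula at a subjective low/likely/high lifetime estimate:

```python
def clv(margin, discount, years):
    """Present value of a constant annual margin over `years` periods."""
    return sum(margin / (1.0 + discount) ** t for t in range(1, years + 1))

margin, discount = 120.0, 0.08       # hypothetical annual margin and rate
low, likely, high = 2, 5, 9          # subjective lifetime estimate (years)
print(f"CLV in [{clv(margin, discount, low):.0f}, "
      f"{clv(margin, discount, high):.0f}], "
      f"most likely {clv(margin, discount, likely):.0f}")
```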

Relevance:

10.00%

Publisher:

Abstract:

The multifractal dimension of chaotic attractors has been studied in a weakly coupled superlattice driven by an incommensurate sinusoidal voltage as a function of the driving voltage amplitude. The derived multifractal dimension for the observed bifurcation sequence shows different characteristics for chaotic, quasiperiodic, and frequency-locked attractors. In the chaotic regime, strange attractors are observed. Even in the quasiperiodic regime, attractors with a certain degree of strangeness may exist. From the observed multifractal dimensions, the deterministic nature of the chaotic oscillations is clearly identified.
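
A sketch of the standard box-counting estimate of a multifractal (Rényi) dimension D_q, which is the kind of quantity the abstract analyzes; the Hénon map below is a hypothetical stand-in data set, since the superlattice measurements are not available here.

```python
import numpy as np

def generalized_dimension(points, q, eps_list):
    """Box-counting estimate of the Renyi dimension D_q (q != 1):
    D_q is the slope of log(sum_i p_i^q)/(q-1) against log(eps)."""
    log_eps, log_mass = [], []
    for eps in eps_list:
        boxes = np.floor(points / eps).astype(np.int64)
        _, counts = np.unique(boxes, axis=0, return_counts=True)
        p = counts / counts.sum()
        log_eps.append(np.log(eps))
        log_mass.append(np.log(np.sum(p ** q)) / (q - 1.0))
    slope, _ = np.polyfit(log_eps, log_mass, 1)
    return slope

def henon(n=100_000, a=1.4, b=0.3):
    """Henon map iterates: a hypothetical stand-in chaotic attractor."""
    x, y, out = 0.1, 0.1, np.empty((n, 2))
    for i in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        out[i] = x, y
    return out

pts = henon()
for q in (0.5, 2.0):
    print(f"D_{q} ~ {generalized_dimension(pts, q, [0.02, 0.01, 0.005]):.2f}")
```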

Relevance:

10.00%

Publisher:

Abstract:

A recent publication reported an exciting polygenic effect of schizophrenia (SCZ) risk variants, identified by a large genome-wide association study (GWAS), on total brain and white matter volumes in schizophrenic patients and, even more prominently, in healthy subjects. The aim of the present work was to replicate and then potentially extend these findings. Following the original publication, polygenic risk scores using single nucleotide polymorphism (SNP) information from the SCZ GWAS (polygenic SCZ risk scores; PSS) were calculated in 122 healthy subjects enrolled in a structural magnetic resonance imaging (MRI) study. These scores were computed based on P-values and odds ratios available through the Psychiatric GWAS Consortium. In addition, polygenic white matter scores (PWM) were calculated using the respective SNP subset from the original publication. None of the polygenic scores, either PSS or PWM, were found to be associated with total brain, white matter or gray matter volume in our replication sample. Minor differences between the original and the present study that might have contributed to the lack of reproducibility (but are unlikely to explain it fully) are the number of subjects, ethnicity, age distribution, array technology, SNP imputation quality and MRI scanner type. In contrast to the original publication, our results do not reveal the slightest signal of association between the described sets of GWAS-identified SCZ risk variants and brain volumes in adults. Caution is indicated in interpreting studies building on polygenic risk scores without a replication sample.
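
For readers unfamiliar with the construction, a generic polygenic-score computation looks roughly like the sketch below (our illustration; the study's exact SNP sets and weights are not reproduced): risk-allele dosages are summed per subject, weighted by the discovery GWAS effect sizes, for SNPs passing a P-value threshold.

```python
import numpy as np

rng = np.random.default_rng(1)

def polygenic_score(dosages, log_odds, pvalues, p_threshold=0.5):
    """Sum of risk-allele dosages weighted by log(odds ratio), keeping
    only SNPs below the discovery P-value threshold."""
    keep = pvalues < p_threshold
    return dosages[:, keep] @ log_odds[keep]

# Synthetic stand-in data: 122 subjects (as in the study), 10k SNPs.
n_subjects, n_snps = 122, 10_000
dosages = rng.integers(0, 3, size=(n_subjects, n_snps)).astype(float)
log_odds = rng.normal(0.0, 0.05, n_snps)      # GWAS effect sizes
pvalues = rng.uniform(size=n_snps)            # GWAS P-values
pss = polygenic_score(dosages, log_odds, pvalues)
print(pss[:5])                                # one score per subject
```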

Relevance:

10.00%

Publisher:

Abstract:

The problem addressed by this study is the deterioration of delivery punctuality as transport paths grow longer and the number of separate operations increases, as often happens when business becomes international. The aim of the study is to develop a method with which delivery punctuality can be planned and controlled, and its targets met, despite the development noted above. The study first analyses how delivery punctuality is interwoven with corporate and marketing strategies and objectives; the significance of punctuality as a competitive factor for companies is also examined in general terms. Next, an interactive method is developed for planning the punctuality of an individual delivery. The operations of a delivery and their alternatives are described as a quantified, directed graph, i.e. a network. First, the case in which operation durations are constant is examined, and then a method that takes the variance of the durations into account, making it possible to find the cheapest solution that is sufficiently punctual and reliable. The application of the method is examined in general in both inbound and outbound logistics, in dynamic decision-making concerning a delivery operation, and in various special cases. The setting of punctuality targets and the related variables are examined both in general and from the viewpoint of practical application. Practical application is tested in an example situation at a selected target company. The functioning of the method is tested by simulating the results and costs of deliveries planned with it, of conventionally planned factory deliveries, and of deliveries from a local warehouse, and by comparing these with one another. The analysis concludes by examining the example situation as a logistic whole, adding the effect of other cost components and of added value to the comparison. Finally, the practical applicability profile of the method and topics for further research are discussed. In summary, the developed method is found to make the reliable implementation of a punctuality-oriented delivery strategy possible, and this in turn can be an advantageous alternative when punctuality has market value and significance as a long-term success factor.
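
To illustrate the core idea (a simplified sketch, not the thesis's actual algorithm): delivery legs form a directed graph with a cost, mean duration and duration spread per edge; assuming independent, normally distributed leg durations, we pick the cheapest route whose probability of meeting the deadline reaches the target. All figures below are hypothetical.

```python
import math

edges = {                         # (from, to): (cost, mean hours, std hours)
    ("plant", "port"): (100, 24, 4),
    ("plant", "hub"):  (400,  6, 1),
    ("port",  "dest"): (150, 72, 12),
    ("hub",   "dest"): (500, 12, 2),
}
routes = [
    [("plant", "port"), ("port", "dest")],   # cheap sea route
    [("plant", "hub"),  ("hub",  "dest")],   # fast air route
]

def on_time_prob(route, deadline):
    """P(total duration <= deadline), legs independent and normal."""
    mu = sum(edges[e][1] for e in route)
    var = sum(edges[e][2] ** 2 for e in route)
    return 0.5 * (1.0 + math.erf((deadline - mu) / math.sqrt(2.0 * var)))

deadline, target = 110.0, 0.95
feasible = [r for r in routes if on_time_prob(r, deadline) >= target]
best = min(feasible, key=lambda r: sum(edges[e][0] for e in r))
print(best, "cost:", sum(edges[e][0] for e in best))
```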

Relevance:

10.00%

Publisher:

Abstract:

This project analyses the cryptographic keys used in Bitcoin. It introduces the basic notions of elliptic curves, elliptic-curve cryptography and Bitcoin needed to carry out the analysis. The analysis consists of exploring the code of different Bitcoin wallets and conducting an empirical study of the randomness of the keys. Finally, the project introduces the concept of a deterministic wallet, how it works, and some of the problems it presents.
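
A deliberately simplified sketch of the deterministic-wallet idea (illustration only: real wallets use the BIP32 hierarchy, and this is not production cryptography): every private key is derived from a single seed, so backing up the seed backs up all keys.

```python
import hashlib

# secp256k1 group order, so derived integers are valid private keys.
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def child_private_key(seed: bytes, index: int) -> int:
    """Toy derivation: hash(seed || index) reduced mod the group order."""
    digest = hashlib.sha256(seed + index.to_bytes(4, "big")).digest()
    return int.from_bytes(digest, "big") % N

seed = b"hypothetical-wallet-seed"
for i in range(3):
    print(i, hex(child_private_key(seed, i)))
```

Note the trade-off the project alludes to: one leaked seed compromises every derived key, which is one of the problems deterministic wallets present.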

Relevance:

10.00%

Publisher:

Abstract:

We present new analytical tools able to predict the averaged behavior of fronts spreading through self-similar spatial systems, starting from reaction-diffusion equations. The averaged speed of these fronts is predicted and compared with the predictions from a more general equation (proposed in a previous work of ours) and with simulations. We focus here on two fractals, the Sierpinski gasket (SG) and the Koch curve (KC), for two reasons: (i) they are widely known structures, and (ii) they are deterministic fractals, so their analytical study turns out to be more intuitive. These structures, despite their simplicity, let us observe several characteristics of fractal fronts. Finally, we discuss the usefulness and limitations of our approach.
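
For context (a standard textbook result added by us, not taken from the abstract), the Euclidean baseline against which fractal front speeds are usually compared is the classical Fisher-KPP pulled-front speed in a homogeneous medium:

```latex
% Euclidean baseline (standard result; the abstract's fractal-specific
% formulas are not reproduced here):
\[
  \partial_t u \;=\; D\,\nabla^2 u + r\,u(1-u),
  \qquad
  v_{\mathrm{front}} \;=\; 2\sqrt{rD}.
\]
```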

Relevance:

10.00%

Publisher:

Abstract:

The Practical Stochastic Model is a simple and robust method for describing coupled chemical reactions. The connection between this stochastic method and a deterministic method was first established to understand how the parameters and variables that describe concentrations in the two methods are related. Two main concepts had to be defined to make this connection: the filling of compartments, or dilutions, and the rate of reaction enhancement. The parameters, variables and time of the stochastic method were scaled with the size of the compartment and compared with a deterministic method. The deterministic approach was employed as an initial reference to achieve a consistent stochastic result. Finally, an independent, robust stochastic method was obtained, which can be compared with the Stochastic Simulation Algorithm developed by Gillespie (1977). The Practical Stochastic Model produced the absolute values that are essential to describe non-linear chemical reactions with a simple structure, and allowed for a correct description of the chemical kinetics.
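
Since the abstract names Gillespie's (1977) Stochastic Simulation Algorithm as its reference point, here is a minimal sketch of the SSA direct method for a hypothetical reversible reaction A + B ⇌ C; the rate constants and molecule counts are illustrative, and the Practical Stochastic Model itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def gillespie(x, t_end, k1=1e-3, k2=0.1):
    """Direct-method SSA for A + B -> C (rate k1*A*B) and C -> A + B
    (rate k2*C). x = [A, B, C] molecule counts."""
    t, traj = 0.0, [(0.0, x.copy())]
    while t < t_end:
        a = np.array([k1 * x[0] * x[1], k2 * x[2]])  # propensities
        a0 = a.sum()
        if a0 == 0.0:                                # no reaction possible
            break
        t += rng.exponential(1.0 / a0)               # waiting time
        if rng.uniform() * a0 < a[0]:                # pick a reaction channel
            x += np.array([-1, -1, 1])
        else:
            x += np.array([1, 1, -1])
        traj.append((t, x.copy()))
    return traj

final_t, final_x = gillespie(np.array([300, 200, 0]), t_end=50.0)[-1]
print(final_t, final_x)
```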

Relevance:

10.00%

Publisher:

Abstract:

Due to its non-storability, electricity must be produced at the same time that it is consumed; as a result, prices are determined on an hourly basis, and analysis becomes more challenging. Moreover, seasonal fluctuations in demand and supply lead to a seasonal behavior of electricity spot prices. The purpose of this thesis is to seek out and remove all causal effects from electricity spot prices, so as to be left with pure prices for modeling purposes. To achieve this, we use Qlucore Omics Explorer (QOE) for visualization and exploration of the data set, and the Time Series Decomposition method to estimate and extract the deterministic components from the series. To obtain the target series, we use regression based on the background variables (water reservoir levels and temperature). The result is three price series (for Sweden, Norway and the system price) with no apparent pattern.
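
A small sketch of the two deterministic-component steps described here, run on synthetic data (the thesis's Nord Pool series and the QOE workflow are not reproduced): classical decomposition removes the trend and the daily seasonal pattern, then regression on the background variables removes their effect, leaving the "pure" series.

```python
import numpy as np

rng = np.random.default_rng(0)
n, period = 24 * 365, 24                 # one year of hourly prices

t = np.arange(n)
reservoir = rng.normal(0.0, 1.0, n)      # background variables (synthetic)
temperature = rng.normal(0.0, 1.0, n)
price = (30.0 + 0.001 * t                                # trend
         + 5.0 * np.sin(2.0 * np.pi * t / period)        # daily season
         - 1.5 * reservoir + 0.8 * temperature           # causal effects
         + rng.normal(0.0, 2.0, n))                      # noise

# Step 1: classical decomposition (moving-average trend, mean seasonal).
trend = np.convolve(price, np.ones(period) / period, mode="same")
detrended = price - trend
seasonal = np.array([detrended[i::period].mean() for i in range(period)])
residual = detrended - np.tile(seasonal, n // period + 1)[:n]

# Step 2: regress the remainder on the background variables.
X = np.column_stack([np.ones(n), reservoir, temperature])
beta, *_ = np.linalg.lstsq(X, residual, rcond=None)
pure = residual - X @ beta               # the "pure" price series
print(beta)                              # roughly [0, -1.5, 0.8]
```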

Relevance:

10.00%

Publisher:

Abstract:

In this thesis the interaction of an electromagnetic field and matter is studied from various aspects in the general framework of cold atoms. Our subjects cover a wide spectrum of phenomena, ranging from semiclassical few-level models to fully quantum mechanical interaction with structured reservoirs, which leads to non-Markovian open quantum system dynamics. Within closed quantum systems, we propose a selective method to manipulate the motional state of atoms in a time-dependent double-well potential and interpret the method in terms of adiabatic processes. We also derive a simple wave-packet model, based on distributions of generalized eigenstates, explaining the finite visibility of interference in overlapping continuous-wave atom lasers. In the context of open quantum systems, we develop an unraveling of non-Markovian dynamics in terms of piecewise deterministic quantum jump processes confined to the Hilbert space of the reduced system: the non-Markovian quantum jump method. As examples, we apply it to simple 2- and 3-level systems interacting with a structured reservoir. Finally, in the context of ion-cavity QED, we study entanglement generation based on collective Dicke modes under experimentally realistic conditions, including photonic losses and atomic spontaneous decay.
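
For orientation, here is a sketch of the standard *Markovian* quantum jump (Monte Carlo wave-function) unraveling for a decaying two-level atom, the baseline that the thesis's non-Markovian quantum jump (NMQJ) method generalizes; the NMQJ-specific reverse jumps are not implemented here, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma, dt, steps, ntraj = 1.0, 0.002, 2000, 200

# Basis ordering |g>, |e>; sigma_minus lowers |e> to |g>.
sm = np.array([[0.0, 1.0], [0.0, 0.0]], dtype=complex)
heff = -0.5j * gamma * (sm.conj().T @ sm)      # effective non-Hermitian H

pop = np.zeros(steps)                          # excited-state population
for _ in range(ntraj):
    psi = np.array([0.0, 1.0], dtype=complex)  # start excited
    for k in range(steps):
        pe = abs(psi[1]) ** 2
        pop[k] += pe
        if rng.uniform() < gamma * pe * dt:    # quantum jump: photon emitted
            psi = sm @ psi
        else:                                  # deterministic no-jump step
            psi = psi - 1j * dt * (heff @ psi)
        psi = psi / np.linalg.norm(psi)
print(pop[-1] / ntraj, "vs exact", np.exp(-gamma * dt * (steps - 1)))
```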

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises an extended character of representation. The human mind is not a passive receiver of external information, but actively constructs intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible in the Cartesian subject-object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which, classically understood, is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) of our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse. Dialectics is Barth's way of expressing knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation, and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God.
The reaction against epistemological Cartesianism, the metaphysics of substance and the deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.

Relevance:

10.00%

Publisher:

Abstract:

The study examines the internationalisation process of a contemporary SME firm and explores the impact of its business network on this development. The objective of the study is to understand SME internationalisation and its dynamics from a network perspective. The purpose of this research project is to describe and explore the development process of a firm and its business network by identifying the changes, critical events and influence factors that shape this development. It is a qualitative case study, which focuses on a Finnish focal firm and its respective business network as it expands into the Greek market. It is a longitudinal research process covering the period from 1994 to 2004. The empirical study concentrates on the paper trading and converting business. The study builds on network theory and the framework provided by Johanson and Mattsson's (1988) model of network internationalisation. The incremental internationalisation theories and network theories form the theoretical focus. The research project is organised according to a process view. The focal firm evolves from a domestically-oriented small subsidiary into an internationally experienced company with activities in several market areas and numerous business networks in various market segments and product categories. The findings illustrate the importance of both the domestic and foreign business network context in a firm's internationalisation process. The results of the study suggest theoretical modifications to the understanding of a firm's internationalisation process, broadening the perspective and incorporating the strategic context of the firm. The findings suggest that the internationalisation process is non-linear and does not follow a deterministic order in its development. They emphasise the significance of relational networks, both managerial and entrepreneurial, for establishing a position in foreign markets, and imply that a firm's evolution is significantly influenced by its business network and by critical events. Business networks gain coherence through common goals, and they use accumulated capabilities to exploit market opportunities. The business network sets constraints and provides opportunities, which makes the related decision making strategically important. The firm co-evolves with its business network. The research project provides an instrumental case study with a description of an SME internationalisation process. It contributes to existing knowledge by illustrating dynamics in an international business network and by pinpointing the importance of suppliers, customers, partners, ownerships and competition to the internationalisation process.

Relevance:

10.00%

Publisher:

Abstract:

Mathematical models often contain parameters that need to be calibrated from measured data. The emergence of efficient Markov Chain Monte Carlo (MCMC) methods has made the Bayesian approach a standard tool in quantifying the uncertainty in the parameters. With MCMC, the parameter estimation problem can be solved in a fully statistical manner, and the whole distribution of the parameters can be explored, instead of obtaining point estimates and using, e.g., Gaussian approximations. In this thesis, MCMC methods are applied to parameter estimation problems in chemical reaction engineering, population ecology, and climate modeling. Motivated by the climate model experiments, the methods are developed further to make them more suitable for problems where the model is computationally intensive. After the parameters are estimated, one can start to use the model for various tasks. Two such tasks are studied in this thesis: optimal design of experiments, where the task is to design the next measurements so that the parameter uncertainty is minimized, and model-based optimization, where a model-based quantity, such as the product yield in a chemical reaction model, is optimized. In this thesis, novel ways to perform these tasks are developed, based on the output of MCMC parameter estimation. A separate topic is dynamical state estimation, where the task is to estimate the dynamically changing model state, instead of static parameters. For example, in numerical weather prediction, an estimate of the state of the atmosphere must constantly be updated based on the recently obtained measurements. In this thesis, a novel hybrid state estimation method is developed, which combines elements from deterministic and random sampling methods.
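
A minimal sketch of the core MCMC step underlying such parameter estimation: random-walk Metropolis on a synthetic exponential-decay model (our stand-in, not the thesis's chemical or climate models). The chain explores the whole posterior rather than returning a single point estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measurements" from an exponential-decay model (a stand-in).
x = np.linspace(0.0, 10.0, 50)
true_theta = np.array([2.0, 0.3])
y = true_theta[0] * np.exp(-true_theta[1] * x) + rng.normal(0.0, 0.05, x.size)

def log_post(theta, sigma=0.05):
    """Gaussian likelihood, flat prior (an assumption of this sketch)."""
    pred = theta[0] * np.exp(-theta[1] * x)
    return -0.5 * np.sum((y - pred) ** 2) / sigma ** 2

theta = np.array([1.0, 1.0])                   # starting point
lp = log_post(theta)
chain = []
for _ in range(20_000):
    proposal = theta + rng.normal(0.0, 0.02, size=2)   # random-walk step
    lp_prop = log_post(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:           # accept/reject
        theta, lp = proposal, lp_prop
    chain.append(theta)
chain = np.array(chain[5_000:])                # drop burn-in
print("posterior mean:", chain.mean(axis=0), "std:", chain.std(axis=0))
```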

Relevance:

10.00%

Publisher:

Abstract:

In any decision making under uncertainty, the goal is usually to minimize the expected cost, and this minimization is normally done by optimization. For simple models, the optimization can easily be performed using deterministic methods. However, many practical models contain complex and varying parameters that cannot easily be taken into account with the usual deterministic methods of optimization, so it is important to look for other methods that can give insight into such models. The MCMC method is one of the practical methods that can be used for the optimization of stochastic models under uncertainty. It is based on simulation and provides a general methodology that can be applied to nonlinear and non-Gaussian state models. The MCMC method is important for practical applications because it is a unified estimation procedure which simultaneously estimates both parameters and state variables, computing their distribution given the data measurements. The MCMC method is also faster in terms of computing time when compared to other optimization methods. This thesis discusses the use of Markov chain Monte Carlo (MCMC) methods for the optimization of stochastic models under uncertainty. It begins with a short discussion of Bayesian inference, MCMC and stochastic optimization methods. An example is then given of how MCMC can be applied to maximize production at minimum cost in a chemical reaction process. It is observed that this method performs well in optimizing the given cost function, with very high certainty.
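
A hedged sketch of how MCMC output can drive optimization under uncertainty: the expected cost is approximated by averaging a (hypothetical) cost model over posterior samples of the uncertain parameter, and the decision minimizing that average is selected. The cost function and samples below are illustrative stand-ins, not the thesis's chemical-process model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an MCMC chain over an uncertain model parameter.
theta_samples = rng.normal(0.5, 0.1, size=1_000)

def cost(decision, theta):
    """Hypothetical cost: shortfall from full production plus expense."""
    production = decision * np.exp(-theta * decision)
    return 10.0 * (1.0 - production) + 0.5 * decision

decisions = np.linspace(0.1, 5.0, 50)
expected = [cost(d, theta_samples).mean() for d in decisions]
best = decisions[int(np.argmin(expected))]
print(f"decision minimizing expected cost: {best:.2f}")
```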