900 results for Time inventory models
Abstract:
In this correspondence, we propose applying hidden Markov model (HMM) theory to the problem of blind channel estimation and data detection. The Baum–Welch (BW) algorithm, which is able to estimate all the parameters of the model, is enriched by introducing some linear constraints emerging from a linear FIR hypothesis on the channel. Additionally, a version of the algorithm that is suitable for time-varying channels is also presented. Performance is analyzed in a GSM environment using standard test channels and is found to be close to that obtained with a nonblind receiver.
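As a hedged illustration of the Baum–Welch machinery this abstract builds on, the sketch below implements one re-estimation step for a generic discrete-output HMM in Python/numpy. It is a toy stand-in, not the authors' receiver: in the channel setting the states would encode the FIR channel memory, the linear constraints mentioned above are omitted, and no scaling is applied (long sequences would need log-domain arithmetic).

```python
import numpy as np

def baum_welch_step(A, B, pi, obs):
    """One EM re-estimation step. A: (S,S) transitions, B: (S,V) emissions,
    pi: (S,) initial distribution, obs: integer observation sequence."""
    S, T = A.shape[0], len(obs)
    alpha = np.zeros((T, S))
    beta = np.zeros((T, S))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):                       # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):              # backward pass
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta                        # state posteriors (unnormalised)
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = (alpha[:-1, :, None] * A[None, :, :]
          * (B[:, obs[1:]].T * beta[1:])[:, None, :])
    xi /= xi.sum(axis=(1, 2), keepdims=True)    # pairwise state posteriors
    A_new = xi.sum(axis=0)
    A_new /= A_new.sum(axis=1, keepdims=True)
    B_new = np.zeros_like(B)
    for v in range(B.shape[1]):
        B_new[:, v] = gamma[obs == v].sum(axis=0)
    B_new /= B_new.sum(axis=1, keepdims=True)
    return A_new, B_new, gamma[0]

rng = np.random.default_rng(0)
obs = rng.integers(0, 2, size=200)              # toy binary observations
A = rng.dirichlet(np.ones(4), size=4)           # random row-stochastic start
B = rng.dirichlet(np.ones(2), size=4)
pi = np.full(4, 0.25)
A, B, pi = baum_welch_step(A, B, pi, obs)
```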
Abstract:
This master's thesis examines mass customization and modularization in the parquet flooring business from the perspective of a single case company. The aim of the work is to describe current operations, to create an operating model based on mass customization and modularization, and to determine the financial and operational effects of the revised model. The literature review examines the concepts of mass customization and modularization together with the management methods associated with them; the basic models of these concepts and the transition to their use are described. The empirical part of the thesis investigates the financial effects of mass customization and modularization in the case company. The financial effects were studied in three ways. First, the effect of the change in the operating model on the amount of working capital tied up in production and in the finished-goods inventory was determined. Next, the effects on production lead times were examined, and calculations were made of the investments required to change the operating model and of the benefits obtainable from them. The empirical results show that the case company's current operations have shortcomings: too much capital is tied up in production, and production throughput times are too long relative to customer requirements. According to the results, mass customization and modularization can bring significant improvements in production throughput and capital commitment and yield substantial monetary savings. Based on the results, it is recommended that the company adopt mass customization and modularization methods in line with the described operating model.
Abstract:
Background Depression is one of the most severe and serious health problems because of its morbidity, its disabling effects, and its societal and economic burden. Despite the variety of existing pharmacological and psychological treatments, most cases evolve with only partial remission, relapse and recurrence. Cognitive models have contributed significantly to the understanding of unipolar depression and its psychological treatment. However, success is only partial and many authors affirm the need to improve those models and the treatment programs derived from them. One issue that requires further elaboration is the difficulty these patients experience in responding to treatment and in maintaining therapeutic gains over time without relapse or recurrence. Our research group has been working on the notion of cognitive conflict viewed as personal dilemmas according to personal construct theory. We use a novel method for identifying those conflicts using the repertory grid technique (RGT). Preliminary results with depressive patients show that about 90% of them have one or more of those conflicts. This fact might explain the blockage and the difficult progress of these patients, especially the more severe and/or chronic cases. These results justify the need for specific interventions focused on the resolution of these internal conflicts. This study aims to empirically test the hypothesis that an intervention focused on the dilemma(s) specifically detected for each patient will enhance the efficacy of cognitive behavioral therapy (CBT) for depression. Design A therapy manual for a dilemma-focused intervention will be tested in a randomized clinical trial comparing the outcomes of two treatment conditions: combined group CBT (eight 2-hour weekly sessions) plus individual dilemma-focused therapy (eight 1-hour weekly sessions), and CBT alone (eight 2-hour group weekly sessions plus eight 1-hour individual weekly sessions). Method Participants are patients aged over 18 years meeting diagnostic criteria for major depressive disorder or dysthymic disorder, with a score of 19 or above on the Beck Depression Inventory, second edition (BDI-II), and presenting at least one cognitive conflict (implicative dilemma or dilemmatic construct) as assessed using the RGT. The BDI-II is the primary outcome measure, collected at baseline, at the end of therapy, and at 3- and 12-month follow-up; other secondary measures are also used. Discussion We expect that adding a dilemma-focused intervention to CBT will increase the efficacy of one of the best-established therapies for depression, thus making a significant contribution to the psychological treatment of depression. Trial registration ISRCTN92443999; ClinicalTrials.gov identifier: NCT01542957.
Abstract:
The tau-function (τ) and the eta-function (η) are phenomenological models that are widely used in the context of timing interceptive actions and collision avoidance, respectively. Both models were previously considered to be unrelated to each other: τ is a decreasing function that provides an estimation of time-to-contact (ttc) in the early phase of an object approach; in contrast, η has a maximum before ttc. Furthermore, it is not clear how both functions could be implemented at the neuronal level in a biophysically plausible fashion. Here we propose a new framework, the corrected modified tau function, capable of predicting both τ-type and η-type responses. The outstanding property of our new framework is its resilience to noise. We show that it can be derived from a firing rate equation and, like η, serves to describe the response curves of collision-sensitive neurons. Furthermore, we show that it predicts the psychophysical performance of subjects determining ttc. Our new framework is thus validated successfully against published and novel experimental data. Within the framework, links between τ-type and η-type neurons are established. Therefore, it could possibly serve as a model for explaining the co-occurrence of such neurons in the brain.
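To make the two response types concrete, here is a small numerical sketch (Python/numpy) of an object approaching at constant speed: τ = θ/θ̇ decreases roughly linearly towards contact, while an η-style quantity θ̇·exp(−αθ) (the functional form used for collision-sensitive neurons by Hatsopoulos and colleagues) peaks before contact. Constants are illustrative, and the corrected modified tau function itself is not reproduced here.

```python
import numpy as np

# Object of size L approaching at speed v, contact at t = ttc (all illustrative).
L_obj, v, ttc = 0.5, 10.0, 2.0
t = np.linspace(0.0, 1.95, 400)
theta = 2 * np.arctan(L_obj / (2 * v * (ttc - t)))   # full optical angle
theta_dot = np.gradient(theta, t)
tau = theta / theta_dot                              # ≈ ttc - t for small angles
eta = theta_dot * np.exp(-5.0 * theta)               # eta-style response, alpha = 5

print("tau at t = 1.0 s:", round(tau[np.searchsorted(t, 1.0)], 3))  # ~1.0 s left
print("eta peaks at t =", round(t[np.argmax(eta)], 2),
      "s, i.e. before contact at", ttc, "s")
```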
Abstract:
Objective To develop a Postnatal Perceived Stress Inventory (PNPSI) and assess its psychometric properties. Design Cross-sectional quantitative study. Setting One nurse-managed labor and delivery unit in a university hospital in a major metropolitan area. Participants One hundred seventy-nine (179) primiparous French-speaking women who gave birth at term. Methods The PNPSI was validated at 6 weeks postpartum; its predictive validity for depression and anxiety was assessed at the same time. Results The exploratory analysis revealed a 19-item structure divided into six factors. The inventory has good internal consistency (Cronbach's alpha = .815). The predictive-validity analysis shows that the PNPSI significantly predicts depression and anxiety at 6 weeks postpartum, and that certain factors are particularly prominent. Conclusion The PNPSI's psychometric properties make it a useful tool for future research to evaluate interventions for perceived stress during the postnatal period. Its predictive power for depression indicates that it is also a promising tool for clinical settings.
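For reference, the internal-consistency figure reported above is the standard Cronbach's alpha. A minimal sketch of its computation, on synthetic item data whose dimensions match the study (the data themselves are simulated):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items). Classic internal-consistency estimate:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
common = rng.normal(size=(179, 1))                      # shared "stress" signal
data = common + rng.normal(scale=1.0, size=(179, 19))   # 19 items, n = 179
print(f"alpha = {cronbach_alpha(data):.3f}")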
Abstract:
Abstract The main objective of this work is to show how the choice of the temporal dimension and of the spatial structure of the population influences an artificial evolutionary process. In the field of Artificial Evolution we can observe a common trend in synchronously evolving panmictic populations, i.e., populations in which any individual can be recombined with any other individual. Already in the 1990s, the works of Spiessens and Manderick, Sarma and De Jong, and Gorges-Schleuter pointed out that, if a population is structured according to a mono- or bi-dimensional regular lattice, the evolutionary process shows a different dynamic with respect to the panmictic case. In particular, Sarma and De Jong studied the selection pressure (i.e., the diffusion of a best individual when the only active operator is selection) induced by a regular bi-dimensional structure of the population, proposing a logistic modeling of the selection pressure curves. This model supposes that the diffusion of a best individual in a population follows an exponential law. We show that such a model is inadequate to describe the process, since the growth speed must be quadratic or sub-quadratic in the case of a bi-dimensional regular lattice. New linear and sub-quadratic models are proposed for modeling the selection pressure curves in, respectively, mono- and bi-dimensional regular structures. These models are extended to describe the process when asynchronous evolutions are employed. Different dynamics of the populations imply different search strategies of the resulting algorithm when the evolutionary process is used to solve optimisation problems. A benchmark of both discrete and continuous test problems is used to study the search characteristics of the different topologies and updates of the populations. In the last decade, the pioneering studies of Watts and Strogatz have shown that most real networks, both in the biological and sociological worlds as well as in man-made structures, have mathematical properties that set them apart from regular and random structures. In particular, they introduced the concept of small-world graphs, and they showed that this new family of structures has interesting computing capabilities. Populations structured according to these new topologies are proposed, and their evolutionary dynamics are studied and modeled. We also propose asynchronous evolutions for these structures, and the resulting evolutionary behaviors are investigated. Many man-made networks have grown, and are still growing, incrementally, and explanations have been proposed for their actual shape, such as Albert and Barabási's preferential attachment growth rule. However, many actual networks seem to have undergone some kind of Darwinian variation and selection. Thus, how these networks might have come to be selected is an interesting yet unanswered question. In the last part of this work, we show how a simple evolutionary algorithm can enable the emergence of these kinds of structures for two prototypical problems of the automata networks world, the majority classification and the synchronisation problems.
Synopsis The main objective of this work is to show the influence of the choice of the temporal dimension and of the spatial structure of a population on an artificial evolutionary process. In the field of Artificial Evolution one can observe a trend towards synchronously evolving panmictic populations, where any individual can be recombined with any other individual in the population. As early as the 1990s, Spiessens and Manderick, Sarma and De Jong, and Gorges-Schleuter observed that, if a population has a regular mono- or bi-dimensional structure, the evolutionary process shows a dynamic different from that of a panmictic population. In particular, Sarma and De Jong studied the selection pressure (i.e., the diffusion of a best individual when only the selection operator is active) induced by a regular bi-dimensional population structure, proposing a logistic model of the selection pressure curves. This model assumes that the diffusion of a best individual follows an exponential law. We show that this model is inadequate to describe the phenomenon, since the growth speed must obey a quadratic or sub-quadratic law in the case of a regular bi-dimensional structure. New linear and sub-quadratic models are proposed for mono- and bi-dimensional structures. These models are extended to describe asynchronous evolutionary processes. Different population dynamics imply different search strategies for the resulting algorithm when the evolutionary process is used to solve optimisation problems. A set of discrete and continuous problems is used to study the search characteristics of the different topologies and population update schemes. In recent years, the studies of Watts and Strogatz have shown that many networks, in the biological and sociological worlds as well as among man-made structures, have mathematical properties that set them apart from both regular and random structures. In particular, they introduced the notion of small-world graphs and showed that this new family of structures has interesting dynamical properties. Populations with these new topologies are proposed, and their evolutionary dynamics are studied and modelled. Asynchronous evolution methods are proposed for populations with these structures, and the resulting dynamics are studied. Many man-made networks have formed incrementally, and explanations for their present shape have been proposed, such as Albert and Barabási's preferential attachment. However, many existing networks must be the product of a process of Darwinian variation and selection; how these structures came to be selected is thus an interesting question that remains unanswered. In the last part of this work, we show how a simple artificial evolutionary process enables this type of topology to emerge for two prototypical problems of the automata networks world, the majority classification and synchronisation tasks.
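The quadratic-growth argument is easy to reproduce numerically. In the sketch below (Python/numpy, illustrative settings), a single best individual takes over a 2-D toroidal lattice when each cell synchronously copies the best of its von Neumann neighbourhood with only selection active; the number of copies after t steps is exactly 2t² + 2t + 1 (the L1 ball), i.e. quadratic rather than exponential as a logistic/panmictic model would imply.

```python
import numpy as np

n = 101                                   # toroidal grid, large enough to avoid wrap
grid = np.zeros((n, n), dtype=bool)       # True = cell holds a copy of the best
grid[n // 2, n // 2] = True
for step in range(1, 26):
    # synchronous update: a cell adopts the best if it or a N/S/E/W neighbour has it
    grid = (grid | np.roll(grid, 1, 0) | np.roll(grid, -1, 0)
                 | np.roll(grid, 1, 1) | np.roll(grid, -1, 1))
    if step % 5 == 0:
        predicted = 2 * step * step + 2 * step + 1
        print(f"step {step:2d}: copies = {int(grid.sum()):5d} "
              f"(quadratic prediction 2t^2+2t+1 = {predicted})")
```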
Abstract:
Over 70% of the total costs of an end product are consequences of decisions made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and if the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems; an appropriate algorithm must therefore be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and computer program which reduce the modelling and optimisation time for structural design. The program needed an optimisation algorithm suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and an optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed; they combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and of the steel structures of flat and ridge roofs. This thesis demonstrates that the modelling time, the most time-consuming part, is significantly reduced, modelling errors are reduced, and the results are more reliable. A new selection rule for the evolution algorithm, which eliminates the need for constraint weight factors, was tested with optimisation cases of steel structures that include hundreds of constraints. The tested algorithm can be used almost as a black box, without parameter settings or penalty factors for the constraints.
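A selection rule that removes constraint weight factors can be illustrated with the feasibility-based comparison below: a sketch in the spirit of Deb's feasibility rule, used here as an illustrative stand-in rather than the exact rule developed in the thesis. Feasible solutions beat infeasible ones, feasible solutions are compared by objective value, and infeasible ones by total constraint violation, so no penalty weights are needed.

```python
def better(a, b):
    """a, b: (objective, total_violation) tuples; violation <= 0 means feasible.
    Returns the preferred solution for a minimisation problem."""
    fa, va = a
    fb, vb = b
    feas_a, feas_b = va <= 0, vb <= 0
    if feas_a and feas_b:
        return a if fa <= fb else b      # both feasible: minimise the objective
    if feas_a != feas_b:
        return a if feas_a else b        # feasibility wins outright
    return a if va <= vb else b          # both infeasible: less violation wins

# feasible (10.0, 0.0) beats the better-objective but infeasible (8.0, 2.5)
print(better((10.0, 0.0), (8.0, 2.5)))
```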
Abstract:
The present study tests the relationships between three frequently used personality models evaluated by the Temperament and Character Inventory-Revised (TCI-R), the Neuroticism Extraversion Openness Five Factor Inventory-Revised (NEO-FFI-R) and the Zuckerman-Kuhlman Personality Questionnaire-50-Cross-Cultural (ZKPQ-50-CC). The results were obtained with a sample of 928 volunteer subjects from the general population, aged between 17 and 28 years. Frequency distributions and alpha reliabilities for the three instruments were acceptable. Correlational and factorial analyses showed that several scales in the three instruments share an appreciable amount of common variance. Five factors emerged from principal components analysis. The first factor comprised A (Agreeableness), Co (Cooperativeness) and Agg-Host (Aggressiveness-Hostility), with secondary loadings from C (Conscientiousness) and SD (Self-Directedness) of other factors. The second factor was composed of N (Neuroticism), N-Anx (Neuroticism-Anxiety), HA (Harm Avoidance) and SD (Self-Directedness). The third factor comprised Sy (Sociability), E (Extraversion), RD (Reward Dependence), ImpSS (Impulsive Sensation Seeking) and NS (Novelty Seeking). The fourth factor comprised Ps (Persistence), Act (Activity) and C, whereas the fifth and last factor was composed of O (Openness) and ST (Self-Transcendence). Confirmatory factor analyses indicate that the scales in each model are highly interrelated and define the specified latent dimensions well. Similarities and differences between these three instruments are further discussed.
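The extraction step itself is ordinary principal components analysis of the scale intercorrelations. A minimal procedural sketch on synthetic stand-in data (two simulated latent traits, three indicator scales each; only the mechanics, not the study's data, are shown):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 928                                          # sample size as in the study
f1, f2 = rng.normal(size=(2, n))                 # two synthetic latent traits
X = np.column_stack([f1 + rng.normal(scale=0.7, size=n) for _ in range(3)]
                    + [f2 + rng.normal(scale=0.7, size=n) for _ in range(3)])
R = np.corrcoef(X, rowvar=False)                 # scale intercorrelation matrix
eigval, eigvec = np.linalg.eigh(R)
order = np.argsort(eigval)[::-1]                 # components by explained variance
loadings = eigvec[:, order[:2]] * np.sqrt(eigval[order[:2]])
print(np.round(loadings, 2))                     # two clear blocks of loadings
```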
Abstract:
The paper is motivated by the valuation problem of guaranteed minimum death benefits in various equity-linked products. At the time of death, a benefit payment is due. It may depend not only on the price of a stock or stock fund at that time, but also on prior prices. The problem is to calculate the expected discounted value of the benefit payment. Because the distribution of the time of death can be approximated by a combination of exponential distributions, it suffices to solve the problem for an exponentially distributed time of death. The stock price process is assumed to be the exponential of a Brownian motion plus an independent compound Poisson process whose upward and downward jumps are modeled by combinations (or mixtures) of exponential distributions. Results for exponential stopping of a Lévy process are used to derive a series of closed-form formulas for call, put, lookback, and barrier options, dynamic fund protection, and dynamic withdrawal benefit with guarantee. We also discuss how barrier options can be used to model lapses and surrenders.
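A quick Monte-Carlo sanity check of the simplest such quantity is easy to set up. The sketch below values a death-benefit call max(S_T − K, 0) paid at an exponentially distributed time of death T and discounted at rate r, with a pure geometric Brownian stock; the compound-Poisson jump part and the paper's closed-form machinery are omitted, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
S0, K, r, sigma = 100.0, 100.0, 0.03, 0.2
mu_death = 0.05                        # exponential death rate: 20-year mean
n = 200_000

T = rng.exponential(1.0 / mu_death, size=n)       # time of death
Z = rng.normal(size=n)
# risk-neutral GBM at the (random) death time
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
value = np.exp(-r * T) * np.maximum(ST - K, 0.0)  # discounted benefit payment

half_width = 1.96 * value.std() / np.sqrt(n)
print(f"expected discounted benefit ≈ {value.mean():.2f} ± {half_width:.2f}")
```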
Abstract:
This paper presents a review of the forest models developed in Spain in recent years, covering both timber and non-timber production as well as forest dynamics (regeneration, mortality). Whole-stand, diameter-class and individual-tree models are presented. The models developed to date have been built from data from permanent plots, trials and the Spanish National Forest Inventory. The paper describes the different submodels developed so far, as well as the software platforms that make these models usable, and outlines the main prospects for the development of forest modelling in Spain.
Abstract:
This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up-table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs based on existing linear color models and fruit histograms were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
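The 3-D LUT idea reduces per-pixel classification to a single table lookup. Below is a hedged sketch of the mechanism in Python/numpy: a 5-bit-per-channel table is filled from a simple linear "red dominates" rule, a hypothetical stand-in for the peach colour models fitted in the paper, and is then used to mask an image in one indexing operation.

```python
import numpy as np

BITS = 5                                    # 32 levels per channel -> 32^3 entries
levels = np.arange(1 << BITS)
r, g, b = np.meshgrid(levels, levels, levels, indexing="ij")
lut = (r > g + 4) & (r > b + 4)             # hypothetical "red peach" colour rule

def classify(image: np.ndarray) -> np.ndarray:
    """image: (H, W, 3) uint8 -> boolean fruit mask via one LUT lookup/pixel."""
    q = image >> (8 - BITS)                 # quantise each channel to 5 bits
    return lut[q[..., 0], q[..., 1], q[..., 2]]

frame = np.random.default_rng(0).integers(0, 256, (120, 160, 3), dtype=np.uint8)
print("fruit pixels:", int(classify(frame).sum()))
```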
Identification-commitment inventory (ICI-Model): confirmatory factor analysis and construct validity
Abstract:
The aim of this study is to confirm the factorial structure of the Identification-Commitment Inventory (ICI) developed within the frame of the Human System Audit (HSA) (Quijano et al. in Revista de Psicología Social Aplicada 10(2):27-61, 2000; Papeles del Psicólogo 29:92-106, 2008). Commitment and identification are understood by the HSA at an individual level as part of the quality of human processes and resources in an organization, and therefore as antecedents of important organizational outcomes such as personnel turnover intentions and organizational citizenship behavior (Meyer et al. in J Org Behav 27:665-683, 2006). The theoretical integrative model underlying the ICI (Quijano et al. 2000) was tested in a sample (N = 625) of workers in a Spanish public hospital. Confirmatory factor analysis through structural equation modeling was performed. An elliptical least squares solution was chosen as the estimation procedure on account of the non-normal distribution of the variables. The results confirm the goodness of fit of an integrative model, which underlies the relation between commitment and identification, although each one is operatively different.
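As an illustration of the kind of model fitted, the sketch below estimates a one-factor confirmatory model by minimising a least-squares discrepancy between the sample and model-implied covariance matrices. Plain unweighted least squares stands in for the elliptical estimator, and the data are simulated with the study's sample size.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 625, 6                                    # sample size as in the study
lam_true = np.array([0.8, 0.7, 0.6, 0.75, 0.65, 0.7])
X = np.outer(rng.normal(size=n), lam_true) + rng.normal(scale=0.5, size=(n, p))
S = np.cov(X, rowvar=False)                      # sample covariance of indicators

def discrepancy(theta):
    lam, psi = theta[:p], theta[p:]
    sigma = np.outer(lam, lam) + np.diag(psi ** 2)   # model-implied covariance
    return np.sum((S - sigma) ** 2)                  # ULS fit function

res = minimize(discrepancy, x0=np.full(2 * p, 0.5), method="L-BFGS-B")
print("estimated loadings:", np.round(res.x[:p], 2))
```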
Abstract:
Demand for new paper machines has declined, and the importance of after-sales services, such as maintenance and spare-part sales, in the paper machine business has recently grown further. New types of services are continuously being developed to increase competitive advantage. One example is a contract-based warehousing service in which parts remain in the seller's warehouse until the customer takes them into use. The aim of this master's thesis is to build a model for warehousing cost accounting and to use it to calculate the costs of the warehousing service. By current thinking, the traditional supply chain with its many warehouses is no longer cost-effective. A growing number of companies in trade and industry have begun to apply VMI (Vendor Managed Inventory) principles in their supply chains: inventories are centralized, information flows quickly between the tiers of the supply chain, and demand can be met with a shorter delay because its predictability improves. The result of the work is a cost accounting model based on activity-based costing, which can also be used to support pricing decisions. The thesis demonstrates the application of the model to different cases and proposes further actions.
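The activity-based costing logic behind such a model is straightforward: pool yearly resource costs by activity, divide by driver volumes to get activity rates, then charge each stocked item by its driver consumption. A minimal sketch with invented figures (the activities, costs and volumes are illustrative, not from the thesis):

```python
# activity: (yearly cost in EUR, yearly cost-driver volume)
activities = {
    "receiving":     (60_000, 4_000),    # driver: goods receipts
    "storage":       (90_000, 1_500),    # driver: pallet places occupied
    "order_picking": (75_000, 10_000),   # driver: pick lines
}
rates = {a: cost / vol for a, (cost, vol) in activities.items()}

# yearly driver consumption of one stocked item in the warehousing service
item_usage = {"receiving": 12, "storage": 3, "order_picking": 40}
item_cost = sum(rates[a] * q for a, q in item_usage.items())

print({a: round(r, 2) for a, r in rates.items()})
print(f"yearly warehousing cost for the item: {item_cost:.2f} EUR")
```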
Abstract:
Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (the number of model parameters) remains a major concern in relation to overfitting and, hence, the transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time, using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a study case. We fit 110 models with different levels of complexity under present-time conditions and tested model performance using AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike Information Criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results to an independent validation by projecting the models to mid-Holocene (6,000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity offered the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change. Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions for which paleoecological data are not available, models of intermediate complexity should be selected.
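The complexity/transferability trade-off can be emulated with any penalised presence-absence model. The sketch below uses L2-regularised logistic regression as a stand-in related to, but not identical to, Maxent, sweeping feature-set complexity and penalty strength on synthetic data and scoring held-out AUC:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))                 # synthetic climate covariates
logit = 1.5 * X[:, 0] - X[:, 1] ** 2           # true response is nonlinear
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # presence/absence labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for degree in (1, 2, 3):                       # more feature terms = more complex
    feats = PolynomialFeatures(degree)
    Xtr, Xte = feats.fit_transform(X_tr), feats.transform(X_te)
    for C in (0.1, 1.0, 10.0):                 # larger C = weaker regularisation
        clf = LogisticRegression(C=C, max_iter=5000).fit(Xtr, y_tr)
        auc = roc_auc_score(y_te, clf.predict_proba(Xte)[:, 1])
        print(f"degree={degree} C={C:4}: held-out AUC = {auc:.3f}")
```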
Abstract:
The purpose of this thesis was to model an inventory management system suitable for the case company. The study began with a survey of the current state of the case company's inventory management, after which the different areas of inventory management were examined. The areas covered included inventory types, motives, objectives, demand forecasting, and various inventory management tools; different inventory replenishment models were also examined. The theoretical part also covered three types of information systems: an ERP system, an e-commerce system, and a custom-built system. In the research plan, these three systems were set as the alternatives from which one would be chosen as the case company's inventory management system. Based on the theory and the current state, a framework was constructed presenting the data and functionality requirements of an inventory management system. These requirements were prioritized into four classes according to their criticality. The system alternatives were evaluated against the criteria of the framework, assessing how easily each requirement could be implemented in each alternative. The results were computed from these evaluations, and their analysis showed that an ERP system would be the best fit as the case company's inventory management system.
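The evaluation step described above amounts to a weighted scoring matrix. A minimal sketch, where the priority weights, requirement names and feasibility scores are invented for illustration:

```python
weights = {1: 4, 2: 3, 3: 2, 4: 1}           # priority class -> weight
requirements = [                              # (requirement, priority class)
    ("demand forecasting", 1), ("reorder points", 1),
    ("e-commerce integration", 2), ("custom reports", 3), ("mobile UI", 4),
]
scores = {                                    # feasibility score 0-3 per system
    "ERP":        [3, 3, 2, 2, 1],
    "e-commerce": [1, 2, 3, 1, 2],
    "custom":     [2, 2, 2, 3, 3],
}
totals = {s: sum(weights[p] * v for (_, p), v in zip(requirements, vals))
          for s, vals in scores.items()}
print(max(totals, key=totals.get), totals)    # highest weighted total wins
```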