919 results for Analytic Reproducing Kernel
Abstract:
Efforts to ‘modernize’ the clinical workforce of the English National Health Service have sought to reconfigure the responsibilities of professional groups in pursuit of more effective, joined-up service provision. Such efforts have met resistance from professions eager to protect their jurisdictions, deploying legitimacy claims familiar from the insights of the sociology of professions. Yet to date few studies of professional boundaries have grounded these insights in the specific context of policy challenges to the inter- and intra-professional division of labour, in relation to the medical profession and other health-related occupations. In this paper we address this gap by considering the experience of newly instituted general practitioners (family physicians) with a special interest (GPSIs) in genetics, introduced to improve genetics knowledge and practice in primary care. Using qualitative data from four comparative case studies, we discuss how an established intra-professional division of labour within medicine, between clinical geneticists and GPs, was opened, negotiated and reclosed in these sites. We discuss the contrasting attitudes of GPSIs and geneticists towards the nature of genetics knowledge and its application, and how these attitudes were used to advance conflicting visions of what the nascent GPSI role should involve. In particular, we show how the knowledge claims of geneticists and GPSIs interacted with wider policy pressures to produce a rather more conservative redistribution of power and responsibility across the intra-professional boundary than the rhetoric of modernization might suggest.
Abstract:
Bahadur representation and its applications have attracted a large number of publications and presentations on a wide variety of problems. Mixing dependence is weak enough to describe the dependence structure of random variables, including observations in time series and longitudinal studies. This note proves the Bahadur representation of sample quantiles for strongly mixing random variables (including ρ-mixing and φ-mixing) under very weak mixing coefficients. As an application, asymptotic normality is derived. These results greatly improve those recently reported in the literature.
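For reference, the representation in question takes the classical form (stated here under standard conditions; the precise almost-sure order of the remainder under mixing is the note's contribution and is not reproduced):

$$\hat{\xi}_{p,n} \;=\; \xi_p \;+\; \frac{p - F_n(\xi_p)}{f(\xi_p)} \;+\; R_n, \qquad 0 < p < 1,$$

where $F_n$ is the empirical distribution function, $\xi_p = F^{-1}(p)$ is the population quantile, $f$ is the density (assumed positive at $\xi_p$), and $R_n$ is an almost-surely negligible remainder. The asymptotic normality of $\hat{\xi}_{p,n}$ then follows by applying a central limit theorem for mixing sequences to $F_n(\xi_p)$.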
Abstract:
To understand our project, one must first understand DEVS. DEVS is among the most popular formalisms for representing discrete event systems (DES). In the 1970s, the mathematician Bernard Zeigler proposed a general formalism for the representation of such systems. This formalism, called DEVS (Discrete EVent System Specification), is the most general formalism for the treatment of DES. DEVS can represent any system whose behaviour can be described as a sequence of discrete events. These events are characterized by a time base in which only a finite number of events can occur. DEVS modelling and simulation has multiple implementations in several programming languages, for example Java, C# or C++. There is, however, a need for a stable distributed platform that provides interoperability mechanisms and integrates diverse DEVS models. In this project, our starting code base is the xDEVS core in Java, in its sequential and parallel forms. Our task is to implement the core in a distributed fashion, so that a DEVS system can be split across several machines. For this we used Java sockets, to make data transmission as efficient as possible. First, the number of machines that will connect to the server must be specified. Once they have connected, each machine is sent the specific work it must simulate. Note that there are two ways of splitting a DEVS system, both implemented in our project. The first is to split it into atomic modules, the indivisible subsystems of a DEVS system. The second is to split the functions of all the subsystems into groups and distribute them among the machines. In summary, our distributed system starts by executing the work assigned to the first client; once that client finishes, it updates the server's information, the server sends the order to the next client, and so on.
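A minimal sketch of the coordination scheme just described, assuming hypothetical class and message names (nothing here is taken from the xDEVS code base): the server waits for the declared number of machines, sends each its partition of the model, then triggers the clients one after another, advancing only when the previous client reports completion.

```java
import java.io.*;
import java.net.*;

// Minimal coordinator sketch: accepts N simulation clients, assigns each a
// partition of the DEVS model, then triggers them sequentially.
public class CoordinatorServer {
    public static void main(String[] args) throws IOException {
        int numMachines = Integer.parseInt(args[0]);   // machines declared up front
        String[] partitions = new String[numMachines]; // e.g. groups of atomic modules
        for (int i = 0; i < numMachines; i++) partitions[i] = "partition-" + i;

        try (ServerSocket server = new ServerSocket(5000)) {
            Socket[] clients = new Socket[numMachines];
            // 1. Wait until every declared machine has connected.
            for (int i = 0; i < numMachines; i++) clients[i] = server.accept();

            // 2. Send each client the piece of the model it must simulate.
            for (int i = 0; i < numMachines; i++) {
                new DataOutputStream(clients[i].getOutputStream())
                        .writeUTF(partitions[i]);
            }

            // 3. Sequential protocol: tell client i to run, wait for its
            //    completion message (carrying updated state), then move on.
            for (int i = 0; i < numMachines; i++) {
                DataOutputStream out = new DataOutputStream(clients[i].getOutputStream());
                DataInputStream in = new DataInputStream(clients[i].getInputStream());
                out.writeUTF("RUN");
                String done = in.readUTF(); // blocks until the client finishes
                System.out.println("client " + i + " reported: " + done);
            }
        }
    }
}
```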
Abstract:
Virtually every sector of business and industry that uses computing, including financial analysis, search engines, and electronic commerce, incorporates Big Data analysis into its business model. Sophisticated clustering algorithms are popular for deducing the nature of data by assigning labels to unlabeled data. We address two main challenges in Big Data. First, by definition, the volume of Big Data is too large to be loaded into a computer's memory (this volume changes based on the computer used or available, but there is always a data set that is too large for any computer). Second, in real-time applications, the velocity of new incoming data prevents historical data from being stored and future data from being accessed. Therefore, we propose our Streaming Kernel Fuzzy c-Means (stKFCM) algorithm, which significantly reduces both computational complexity and space complexity. The proposed stKFCM requires only O(n²) memory, where n is the (predetermined) size of a data subset (or data chunk) at each time step, which makes the algorithm truly scalable (as n can be chosen based on the available memory). Furthermore, only 2n² elements of the full N × N (where N ≫ n) kernel matrix need to be calculated at each time step, reducing both the time spent producing the kernel elements and the complexity of the FCM algorithm. Empirical results show that stKFCM, even with relatively small n, can provide clustering performance as accurate as kernel fuzzy c-means run on the entire data set, while achieving a significant speedup.
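A sketch of the per-step kernel bookkeeping described above, as an illustration rather than the authors' published implementation (the RBF kernel and all names are assumptions): at each time step only the chunk-versus-chunk and chunk-versus-retained blocks are evaluated, i.e. 2n² kernel entries instead of the full N × N matrix.

```java
// Sketch of the per-step kernel computation in a streaming kernel FCM:
// only 2n^2 entries (chunk-vs-chunk and chunk-vs-retained) are evaluated.
public class StreamingKernelStep {
    // Gaussian (RBF) kernel between two feature vectors.
    static double rbf(double[] a, double[] b, double gamma) {
        double d2 = 0.0;
        for (int i = 0; i < a.length; i++) {
            double diff = a[i] - b[i];
            d2 += diff * diff;
        }
        return Math.exp(-gamma * d2);
    }

    /** Kernel block between two point sets; an |x| x |y| matrix. */
    static double[][] kernelBlock(double[][] x, double[][] y, double gamma) {
        double[][] k = new double[x.length][y.length];
        for (int i = 0; i < x.length; i++)
            for (int j = 0; j < y.length; j++)
                k[i][j] = rbf(x[i], y[j], gamma);
        return k;
    }

    public static void main(String[] args) {
        double gamma = 0.5;                          // hypothetical kernel width
        double[][] chunk = {{0, 1}, {1, 0}};         // n new points at this step
        double[][] retained = {{0.5, 0.5}, {1, 1}};  // n points kept from the past
        double[][] kNewNew = kernelBlock(chunk, chunk, gamma);    // n^2 entries
        double[][] kNewOld = kernelBlock(chunk, retained, gamma); // n^2 entries
        // A weighted kernel FCM update would now run on these 2n^2 values only.
        System.out.println(kNewNew[0][1] + " " + kNewOld[0][0]);
    }
}
```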
Abstract:
Previous research with the ratio-bias task found larger response latencies for conflict trials, where the heuristic- and analytic-based responses are assumed to be in opposition (e.g., choosing between 1/10 and 9/100 ratios of success), than for no-conflict trials, where both processes converge on the same response (e.g., choosing between 1/10 and 11/100). This pattern is consistent with parallel dual-process models, which assume that there is effective, rather than lax, monitoring of the output of heuristic processing. It is, however, unclear why conflict resolution sometimes fails. Ratio-biased choices may increase because of a decline in analytical reasoning (leaving heuristic-based responses unopposed) or because of a rise in heuristic processing (making it more difficult for analytic processes to override the heuristic preferences). Using the process-dissociation procedure, we found that instructions to respond logically and response speed affected analytic (controlled) processing (C), leaving heuristic processing (H) unchanged, whereas the intuitive preference for large numerators (as assessed by responses to equal-ratio trials) affected H but not C. These findings create new challenges for the debate between dual-process and single-process accounts, which we discuss.
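One common formalization of the procedure (a sketch of standard process-dissociation logic; the authors' exact parameterization may differ): the heuristic-consistent option is chosen on no-conflict trials when analytic processing succeeds or, failing that, when the heuristic drives the response, while on conflict trials it is chosen only when analytic processing fails and the heuristic takes over:

$$P(\text{heuristic choice}\mid\text{no-conflict}) = C + (1-C)\,H, \qquad P(\text{heuristic choice}\mid\text{conflict}) = (1-C)\,H,$$

so that $C$ is estimated as the difference between the two probabilities and $H$ as $P(\text{heuristic choice}\mid\text{conflict})/(1-C)$.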
Abstract:
Over the last decade, social media has become a hot topic for researchers of collaborative technologies (e.g., CSCW). The pervasive use of social media in our everyday lives provides a ready source of naturalistic data for researchers to empirically examine the complexities of the social world. In this talk I outline a different perspective informed by ethnomethodology and conversation analysis (EMCA), an orientation that has been influential within CSCW yet has only rarely been applied to social media use. EMCA approaches can complement existing perspectives by articulating how social media is embedded in everyday life, and how its social organisation is achieved by its users. Outlining a possible programme of research, I draw on a corpus of screen and ambient audio recordings of mobile device use to show how EMCA research can be generative for understanding social media through concepts such as adjacency pairs, sequential context, turn allocation / speaker selection, and repair. In doing so, I also raise questions about existing studies of social media use and the way they characterise interactional phenomena.
Abstract:
In this thesis we study the heat kernel, a useful tool for analyzing various properties of different quantum field theories. In particular, we focus on the one-loop effective action and on the application of worldline path integrals to derive perturbatively the heat kernel coefficients for the Proca theory of massive vector fields. It turns out that the worldline path integral method encounters difficulties if the differential operator of the heat kernel is of non-minimal type. More precisely, a direct recasting of the differential operator in terms of worldline path integrals produces a non-perturbative vertex in the classical action, and the path integral cannot be solved. In this work we look for ways to circumvent this issue and suggest how to solve similar problems in other contexts.
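For orientation, the standard objects involved (textbook conventions, which may differ from those of the thesis): for a Laplace-type operator $H$ in $d$ dimensions, the heat kernel and its proper-time trace expansion are

$$K(\tau;x,y) = \langle y\,|\,e^{-\tau H}\,|\,x\rangle, \qquad \operatorname{Tr} e^{-\tau H} \;\sim\; \frac{1}{(4\pi\tau)^{d/2}} \sum_{n\geq 0} \tau^{n} \int d^{d}x\,\sqrt{g}\; a_{n}(x),$$

where the $a_n$ are the heat kernel (Seeley-DeWitt) coefficients, and the one-loop effective action is recovered as $\Gamma = -\tfrac{1}{2}\int_0^\infty \tfrac{d\tau}{\tau}\operatorname{Tr} e^{-\tau H}$. A non-minimal operator, such as that of the Proca theory, is precisely one whose second-derivative term is not of this simple Laplace type, which is where the worldline recasting runs into trouble.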
Abstract:
In this thesis project, I present stationary models of rotating fluids with toroidal distributions that can be used to represent the central obscurers of active galactic nuclei (AGN), i.e. molecular tori (Combes et al., 2019), as well as geometrically thick accretion discs, like ADAF discs (Narayan and Yi, 1995) or Polish doughnuts (Abramowicz, 2005). In particular, I study stationary rotating systems with a more general baroclinic distribution (with a vertical gradient of the angular velocity), which are often more realistic, but less studied owing to their complexity, than barotropic ones (with cylindrical rotation), which are easier to construct. In the thesis, I compute analytically the main intrinsic and projected properties of power-law tori based on the potential-density pairs of Ciotti and Bertin (2005). I study the density distribution and the resulting gravitational potential for different values of α in the range 2 < α < 5. For the same models, I compute the surface density of the systems when seen face-on and edge-on. I then apply the stationary Euler equations to obtain the rotational velocity and temperature distributions of the self-gravitating models in the absence of an external gravitational potential. I also consider power-law tori with a central black hole in addition to the gas self-gravity and, solving the stationary Euler equations analytically, I compute how the properties of the system are modified by the black hole and how they vary as a function of the black hole mass. Finally, applying the Solberg-Høiland criterion, I show that these baroclinic stationary models are linearly stable in the absence of the black hole. In the presence of the black hole I derive the analytical condition for stability, which depends on α and on the black hole mass. I also study the stability of the tori under the hypothesis that they are weakly magnetized, finding that in that case they are always unstable.
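For reference, the stationary Euler equations used throughout take the standard axisymmetric form (in cylindrical coordinates $(R,\varphi,z)$, with conventions assumed here):

$$\frac{1}{\rho}\frac{\partial p}{\partial R} = -\frac{\partial \Phi}{\partial R} + \frac{v_{\varphi}^{2}}{R}, \qquad \frac{1}{\rho}\frac{\partial p}{\partial z} = -\frac{\partial \Phi}{\partial z},$$

where $\Phi$ is the total gravitational potential (the gas self-gravity plus, where present, the central black hole), and a baroclinic configuration is one in which $v_{\varphi}$ varies with both $R$ and $z$.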
Abstract:
In the face of global competition, the survival of a manufacturing company depends more and more on how well it can design, manage and structure its production system to cope with product variety, improve delivery reliability and reduce costs. In this context, manufacturing companies often use different production systems, according to what the market demands. Very broadly, production systems can be classified into two main categories, make-to-stock (MTS) and make-to-order (MTO), according to the policy with which they respond to market demand. In the new competitive environment, companies have found themselves having to constantly produce specific, high-quality products at low unit costs and high service levels (i.e., short delivery times). It is clear, therefore, that one of the main strategic decisions companies face is the MTS/MTO partitioning of products: which product or product family should be made to stock (MTS), which should be made to order (MTO), and which should be manufactured under a hybrid MTS/MTO production policy. Recent years have seen a series of changes in companies' production policies, which are gradually shifting more and more towards the hybrid MTS/MTO production mode. In particular, this work focuses on delayed product differentiation (DPD), a particular hybrid production strategy, and proposes a decision model for it based on the Analytic Network Process (ANP), implemented with the Superdecisions software.
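For context, a brief note on the method named above (a standard statement of ANP, not a detail of this specific decision model): pairwise-comparison judgments over the network's clusters and elements are assembled into a column-stochastic supermatrix $W$, and global priorities are read from its limit,

$$W_{\infty} \;=\; \lim_{k\to\infty} W^{k}$$

(or a Cesàro average of the powers when the sequence cycles); Superdecisions automates this computation. The alternative with the highest limiting priority, here an MTS, MTO, or hybrid MTS/MTO assignment, is the one the model recommends.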
Abstract:
The main function of cardboard packaging for horticultural products is to protect them. Designing cardboard packaging requires knowledge of the bending stiffness, which depends on the modulus of elasticity. The objective of this work was to calculate the cardboard modulus of elasticity from data obtained in the laboratory through physical characterization tests, using different methods, and to compare the results with the values obtained experimentally. Ten samples of each cardboard selected for this study were tested in the paper machine direction and in the transverse direction. The tensile strength of the liner and medium papers, used to calculate the bending stiffness, was determined in a universal testing machine. The bending stiffness itself was obtained with the four-point bending test. Considerable variation was observed among the methods from which the modulus of elasticity is obtained, and this variation influences the bending stiffness of the structure. The stiffness values obtained experimentally were always greater than the values obtained by the analytical method. This difference can be attributed to two factors: the production process, which ensures greater rigidity than the components provide separately, and the added adhesive layer, which is not taken into consideration in the analytical calculations.
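For context, a hedged sketch of the kind of analytical calculation involved (a standard sandwich-beam formula from laminate theory; the paper's exact method is not reproduced here): treating the board as layers of liner and medium, the bending stiffness per unit width follows from the parallel-axis theorem,

$$D \;=\; \sum_{i} E_{i}\left(\frac{t_{i}^{3}}{12} + t_{i}\,z_{i}^{2}\right),$$

where $E_i$ is the modulus of elasticity of layer $i$, $t_i$ its thickness, and $z_i$ the distance of its mid-plane from the neutral axis. Glue lines and the stiffening contributed by the converting process are exactly the terms such a calculation omits, consistent with the experimental values exceeding the analytical ones.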
Abstract:
Shelled, roasted and salted cashew nut kernels were packaged in three flexible materials with different barrier properties (PP/PE = polypropylene/polyethylene; PETmet/PE = metallized polyethylene terephthalate/polyethylene; PET/Al/LDPE = polyethylene terephthalate/aluminum foil/low-density polyethylene). Kernels were stored for one year at 30 °C and 80% relative humidity. Quantitative descriptive sensory analysis (QDA) was performed at the end of the storage time. The descriptive terms obtained for kernel characterization were brown color, color uniformity and rugosity for appearance; toasted kernel, sweet, old and rancidity for odor; toasted kernel, sweet, old, rancidity, salt and bitter for taste; and crispness for texture. QDA showed that the factors responsible for the decrease in sensory quality after one year of storage were an increase in old aroma and taste, an increase in rancid aroma and taste, a decrease in toasted-kernel aroma and taste, and a decrease in crispness. The decrease in sensory quality was greatest in kernels packaged in PP/PE.