954 results for chiller units


Relevance: 20.00%

Abstract:

The roots of the concept of cortical columns stretch far back into the history of neuroscience. The impulse to compartmentalise the cortex into functional units can be seen at work in the phrenology of the beginning of the nineteenth century. At the beginning of the next century Korbinian Brodmann and several others published treatises on cortical architectonics. Later, in the middle of that century, Lorente de Nó wrote of chains of ‘reverberatory’ neurons orthogonal to the pial surface of the cortex, calling them ‘elementary units of cortical activity’. This is the first hint that a columnar organisation might exist. With the advent of microelectrode recording, first Vernon Mountcastle (1957) and then David Hubel and Torsten Wiesel provided evidence consistent with the idea that columns might constitute units of physiological activity. This idea was backed up in the 1970s by clever histochemical techniques and culminated in Hubel and Wiesel’s well-known ‘ice-cube’ model of the cortex and Szentágothai’s brilliant iconography. The cortical column can thus be seen as the terminus ad quem of several great lines of neuroscientific research: currents originating in phrenology and passing through cytoarchitectonics; currents originating in neurocytology and passing through Lorente de Nó. Famously, Huxley noted the tragedy of a beautiful hypothesis destroyed by an ugly fact. Famously, too, human visual perception is orientated toward seeing edges and demarcations when, perhaps, they are not there. Recently the concept of cortical columns has come in for the same radical criticism that undermined the architectonics of the early part of the twentieth century. Does history repeat itself? This paper reviews this history and asks the question.


This paper introduces a new mathematical method for improving the discrimination power of data envelopment analysis and for completely ranking the efficient decision-making units (DMUs). The fuzzy concept is utilised. For this purpose, all DMUs are first evaluated with the CCR model. Thereafter, the resulting weights for each output are treated as fuzzy sets and then converted to fuzzy numbers. The introduced model is a multi-objective linear model whose endpoints are the highest and lowest of the weighted values. An added advantage of the model is its ability to handle the infeasibility sometimes faced by previously introduced models.
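The CCR evaluation in the first step above is the standard DEA multiplier model: for each DMU, maximise its weighted outputs subject to its weighted inputs summing to one and no DMU scoring above one. A minimal sketch of that baseline (not the paper's fuzzy extension), with invented data:

```python
# CCR multiplier model via linear programming: one LP per DMU.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiencies(X, Y):
    """X: (n, m) inputs, Y: (n, s) outputs for n DMUs; returns CCR scores."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for k in range(n):
        # Variables: output weights u (length s), then input weights v (length m).
        c = np.concatenate([-Y[k], np.zeros(m)])             # maximise u . y_k
        A_eq = np.concatenate([np.zeros(s), X[k]])[None, :]  # v . x_k = 1
        A_ub = np.hstack([Y, -X])                            # u . y_j - v . x_j <= 0
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        scores.append(-res.fun)
    return scores

X = np.array([[2.0], [4.0], [3.0]])   # single input per DMU (invented)
Y = np.array([[2.0], [2.0], [3.0]])   # single output per DMU (invented)
print(ccr_efficiencies(X, Y))         # DMUs 1 and 3 efficient; DMU 2 scores 0.5
```

The output weights `u` recovered from each LP are the quantities the paper then reinterprets as fuzzy sets.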


The reaction of [Re6Q8(OH)6]4- (Q = S, Se) with p-tert-butylpyridine (TBP) in water leads to neutral trans-[Re6Q8(TBP)4(OH)2], whose hydroxyl reactivity with carboxylic acids and TBP exchange reactions with functional pyridines have been investigated.


Measurement of glycated haemoglobin A1c (HbA1c) provides an indication of longer-term glycaemic control. Standardisation of this test between laboratories is difficult to achieve, and most assays are currently calibrated to the values used in the Diabetes Control and Complications Trial (DCCT-aligned). With the availability of more specific reference standards, it is now proposed that HbA1c be expressed as mmol HbA1c per mol of non-glycated haemoglobin. An HbA1c of 7% is approximately equal to 53 mmol/mol.
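The DCCT-aligned percentage and the IFCC mmol/mol scales are linked by the published master equation, IFCC = (NGSP − 2.15) × 10.929, which reproduces the 7% ≈ 53 mmol/mol figure quoted above. A minimal sketch:

```python
# Conversion between DCCT-aligned (NGSP, %) and IFCC (mmol/mol) HbA1c units
# using the standard master equation.
def ngsp_to_ifcc(percent):
    return (percent - 2.15) * 10.929

def ifcc_to_ngsp(mmol_per_mol):
    return mmol_per_mol / 10.929 + 2.15

print(round(ngsp_to_ifcc(7.0)))   # 53, matching the value stated above
```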


Data Envelopment Analysis (DEA) is recognised as a modern approach to assessing the performance of a set of homogeneous Decision Making Units (DMUs) that use similar resources to produce similar outputs. While DEA is commonly used with precise data, several approaches have recently been introduced for evaluating DMUs with uncertain data. In the existing approaches, much of the information on uncertainty is lost: for example, in the defuzzification approach the α-level and fuzzy ranking are not considered, while in the tolerance approach the inequality or equality signs are fuzzified but the fuzzy coefficients (inputs and outputs) are not treated directly. The purpose of this paper is to develop a new model for evaluating DMUs under uncertainty using fuzzy DEA, incorporating the α-level into the model under a fuzzy environment. An example is given to illustrate the method in detail.
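The α-level idea the paper incorporates can be illustrated independently of the DEA model itself: at level α, a triangular fuzzy number (a, b, c) reduces to a crisp interval, which is how fuzzy inputs and outputs become interval data. A small sketch (not the paper's model):

```python
# Alpha-cut of a triangular fuzzy number (a, b, c):
# at level alpha in [0, 1] it becomes the interval
# [a + alpha*(b - a), c - alpha*(c - b)].
def alpha_cut(a, b, c, alpha):
    return (a + alpha * (b - a), c - alpha * (c - b))

print(alpha_cut(1.0, 2.0, 4.0, 0.5))   # (1.5, 3.0)
print(alpha_cut(1.0, 2.0, 4.0, 1.0))   # (2.0, 2.0): the crisp peak
```

Raising α narrows the interval, so sweeping α from 0 to 1 traces how much uncertainty the analyst is willing to admit into the efficiency scores.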


DEA literature continues apace but software has lagged behind. This session uses suitably selected data to present newly developed software which includes many of the most recent DEA models. The software enables the user to address a variety of issues not frequently found in existing DEA software, such as:

- assessments under a variety of possible returns-to-scale assumptions, including NIRS and NDRS;
- scale elasticity computations;
- numerous input/output variables and a truly unlimited number of assessment units (DMUs);
- panel data analysis;
- analysis of categorical data (multiple categories);
- the Malmquist index and its decompositions;
- computation of super-efficiency;
- automated removal of super-efficient outliers under user-specified criteria;
- graphical presentation of results;
- integrated statistical tests.


This paper explores the potential for cost savings in the general Practice units of a Primary Care Trust (PCT) in the UK. We have used Data Envelopment Analysis (DEA) to identify benchmark Practices, which offer the lowest aggregate referral and drugs costs controlling for the number, age, gender, and deprivation level of the patients registered with each Practice. For the remaining, non-benchmark Practices, estimates of the potential for savings on referral and drug costs were obtained. Such savings could be delivered through a combination of the following actions: (i) reducing the levels of referrals and prescriptions without affecting their mix (£15.74 m savings were identified, representing 6.4% of total expenditure); (ii) switching between inpatient and outpatient referrals and/or drug treatment to exploit differences in their unit costs (£10.61 m savings were identified, representing 4.3% of total expenditure); (iii) seeking a different profile of referral and drug unit costs (£11.81 m savings were identified, representing 4.8% of total expenditure). © 2012 Elsevier B.V. All rights reserved.


The aim of this paper is to identify benchmark cost-efficient General Practitioner (GP) units at delivering health care in the Geriatric and General Medicine (GMG) specialty and estimate potential cost savings. The use of a single medical specialty makes it possible to reflect more accurately the medical condition of the List population of the Practice so as to contextualize its expenditure on care for patients. We use Data Envelopment Analysis (DEA) to estimate the potential for cost savings at GP units and to decompose these savings into those attributable to the reduction of resource use, to altering the mix of resources used and to those attributable to securing better resource 'prices'. The results reveal a considerable potential for savings of varying composition across GP units. © 2013 Elsevier Ltd.


The operation of technical processes requires increasingly advanced supervision and fault diagnostics to improve reliability and safety. This paper gives an introduction to the field of fault detection and diagnostics and provides a short classification of methods. The growing complexity and functional importance of inertial navigation systems (INS) mean that equipment failures lead to high losses. The paper is devoted to the development of an INS diagnostics system that allows the cause of a malfunction to be identified. The practical realisation of this system is a software package performing multidimensional information analysis. The project consists of three parts: a subsystem for analysis, a subsystem for data collection, and a universal interface for an open-architecture realisation. To improve diagnostics on small analysis samples, new approaches will be applied, based on voting among pattern-recognition algorithms and on taking into account correlations between target and input parameters. The system is currently at the development stage.
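The voting among pattern-recognition algorithms mentioned above can be sketched as a simple majority vote over the labels proposed by independent detectors. The paper does not give an implementation, so the fault labels below are hypothetical:

```python
# Majority voting among diagnostic classifiers: the most frequent
# label across the detectors' outputs wins.
from collections import Counter

def majority_vote(predictions):
    """Return the label predicted by the largest number of classifiers."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical fault labels from three independent detectors:
votes = ["gyro_drift", "gyro_drift", "accel_bias"]
print(majority_vote(votes))   # gyro_drift
```

On small samples, combining several weak detectors this way tends to be more robust than trusting any single one, which is the motivation stated in the abstract.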


The architecture and learning algorithm of a self-learning spiking neural network for the fuzzy clustering task are outlined. Fuzzy receptive neurons for the pulse-position transformation of input data are considered. It is proposed to treat a spiking neural network in terms of the apparatus of classical automatic control theory based on the Laplace transform. It is shown that synapse functioning can be easily modelled by a second-order damped response unit. The spiking neuron soma is presented as a threshold detection unit. Thus, the proposed fuzzy spiking neural network is an analogue-digital nonlinear pulse-position dynamic system. It is demonstrated how fuzzy probabilistic and possibilistic clustering approaches can be implemented on the basis of the presented spiking neural network.


One of the major challenges in measuring efficiency in terms of resources and outcomes is the assessment of the evolution of units over time. Although Data Envelopment Analysis (DEA) has been applied to time series datasets, DEA models, by construction, form the reference set for inefficient units (lambda values) based on their distance from the efficient frontier, that is, in a spatial manner. However, when dealing with temporal datasets, the proximity in time between units should also be taken into account, since it reflects the structural resemblance among time periods of a unit that evolves. In this paper, we propose a two-stage spatiotemporal DEA approach, which captures both the spatial and the temporal dimension through a multi-objective programming model. In the first stage, DEA is solved iteratively, admitting for each unit only previous DMUs as peers in its reference set. In the second stage, the lambda values derived from the first stage are fed to a multi-objective mixed integer linear programming model, which filters peers in the reference set based on weights assigned to the spatial and temporal dimensions. The approach is demonstrated on a real-world example drawn from software development.
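The first stage can be sketched as a standard input-oriented CRS envelopment model solved per period, with the reference set restricted to the current and earlier periods. This is a minimal single-input, single-output sketch under invented data, not the paper's full multi-objective formulation:

```python
# Stage-1 sketch: efficiency of period k relative to peers from periods 0..k.
import numpy as np
from scipy.optimize import linprog

def temporal_efficiency(x, y, k):
    """x, y: 1-D arrays of one input and one output per period.
    Returns the input-oriented CRS efficiency of period k,
    benchmarked only against periods 0..k."""
    peers = np.arange(k + 1)
    # Variables: theta, then one lambda per admitted peer.
    c = np.concatenate([[1.0], np.zeros(len(peers))])      # minimise theta
    A_ub = np.array([
        np.concatenate([[-x[k]], x[peers]]),               # sum(l_j x_j) <= theta x_k
        np.concatenate([[0.0], -y[peers]]),                # sum(l_j y_j) >= y_k
    ])
    b_ub = np.array([0.0, -y[k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    return res.fun

x = np.array([2.0, 4.0, 3.0])   # input per period (invented)
y = np.array([2.0, 2.0, 3.0])   # output per period (invented)
print([temporal_efficiency(x, y, k) for k in range(3)])   # ~[1.0, 0.5, 1.0]
```

The lambda values of each LP (here left inside `res.x`) are what the second-stage model would then filter by spatial and temporal weights.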


2000 Mathematics Subject Classification: 20C05, 16U60, 16S84, 15A33.


Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2016.


The article analyses the technical efficiency of Hungarian crop farms between 2001 and 2009, using panel data and employing both a standard stochastic frontier analysis (SFA) model and the latent class model (LCM), which accounts for technological differences, to estimate technical efficiency. The findings suggest that technological heterogeneity can play an important role even in a sector such as arable crop production, where relatively homogeneous technology is employed. A comparison of the standard SFA models, which assume a technology common to all farms, with the LCM estimates shows that traditional SFA models may underestimate the technical efficiency of crop farms.