Abstract:
Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest-neighbour (NN) methods, which are known to have limitations when dealing with high-dimensional data. We apply Support Vector Machines (SVMs), a family of theoretically grounded machine-learning techniques designed to handle high-dimensional data, to a dataset from Lochaber, Scotland, to assess their applicability in avalanche forecasting. Initial experiments showed that SVMs gave results comparable with NN for categorical and probabilistic forecasts. Experiments exploiting the ability of SVMs to handle high dimensionality in producing a spatial forecast show promise, but require further work.
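As a rough sketch of this kind of comparison (not the authors' pipeline, and with synthetic placeholder features standing in for the Lochaber data), a support vector classifier and a nearest-neighbour baseline can be compared on cross-validated categorical accuracy, with probability outputs serving as the probabilistic forecasts:

```python
# Hypothetical sketch: SVM vs nearest-neighbour forecasts of avalanche days.
# Feature names and data are placeholders, not the Lochaber dataset.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_days = 500
# Placeholder predictors: snow depth, new snow, wind speed, air temperature, ...
X = rng.normal(size=(n_days, 8))
# Placeholder target: 1 = avalanche day, 0 = no avalanche
y = (X[:, 0] + 0.5 * X[:, 1] - 0.7 * X[:, 2]
     + rng.normal(scale=0.8, size=n_days) > 0.8).astype(int)

nn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=10))
svm = make_pipeline(StandardScaler(),
                    SVC(kernel="rbf", C=1.0, gamma="scale", probability=True))

for name, model in [("nearest neighbours", nn), ("SVM", svm)]:
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean CV accuracy = {acc.mean():.2f}")

# probability=True lets the SVM emit probabilistic forecasts via predict_proba,
# mirroring the categorical vs probabilistic comparison in the abstract.
svm.fit(X, y)
print("P(avalanche) for the first three days:", svm.predict_proba(X[:3])[:, 1])
```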
Abstract:
Although cross-sectional diffusion tensor imaging (DTI) studies have revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Among them, 40 were stable (sMCI; 9 single-domain amnestic, 7 single-domain frontal, 24 multiple-domain) and 27 progressive (pMCI; 7 single-domain amnestic, 4 single-domain frontal, 16 multiple-domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistics included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC than in MCI in a distributed network including the ventral part of the corpus callosum and right temporal and frontal pathways. There were no significant group-level differences between sMCI and pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed individual classification with accuracies up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When the MCI subgroups were considered separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5%, in the multiple-domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of subtype, indicating that this method may become an easily applicable tool for the early detection of individual MCI subjects evolving to dementia.
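As an illustration only (the Tract-Based Spatial Statistics feature extraction is not reproduced, and the feature matrix below is synthetic), individual classification of the kind reported above can be framed as leave-one-out cross-validation of a linear SVM over per-subject diffusion features:

```python
# Hypothetical sketch of leave-one-out SVM classification of stable vs progressive MCI.
# X would hold per-subject diffusion features (e.g. skeletonised FA values); here it is random.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_smci, n_pmci, n_features = 40, 27, 2000   # subject counts from the abstract; feature count assumed
X = rng.normal(size=(n_smci + n_pmci, n_features))
y = np.array([0] * n_smci + [1] * n_pmci)   # 0 = stable MCI, 1 = progressive MCI

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {acc.mean():.3f}")
```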
Abstract:
Proponents of microalgae biofuel technologies often claim that the world demand for liquid fuels, about 5 trillion liters per year, could be supplied by microalgae cultivated on only a few tens of millions of hectares. This perspective reviews this subject and points out that such projections are greatly exaggerated, because (1) the productivities achieved in large-scale commercial microalgae production systems, operated year-round, do not surpass those of irrigated tropical crops; (2) cultivating, harvesting and processing microalgae solely for the production of biofuels is simply too expensive using current or prospective technology; and (3) currently available (limited) data suggest that the energy balance of algal biofuels is very poor. Thus, microalgal biofuels are no panacea for depleting oil or global warming, and are unlikely to save the internal combustion machine.
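To make the criticised claim concrete, dividing the stated demand by an assumed area of 50 million hectares (one reading of "a few tens of millions", used here purely for illustration) implies roughly:

```latex
\frac{5\times10^{12}\ \mathrm{L\,yr^{-1}}}{5\times10^{7}\ \mathrm{ha}}
  = 1\times10^{5}\ \mathrm{L\,ha^{-1}\,yr^{-1}}
  \approx 10\ \mathrm{L}\ \text{of fuel per m}^{2}\ \text{per year}
```

an implied areal fuel yield of the order the review argues is unattainable with year-round commercial cultivation.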
Abstract:
Recent studies have shown that in humans germinal center reactions produce three types of V(D)J-mutated B cells in similar proportions, i.e. Ig-switched, IgD-IgM+ (IgM-only) and IgD+IgM+ cells, and that together they form the CD27+ compartment of recirculating B cells. We investigated the Ig isotype switch capacity of these cells. Peripheral blood B cell subsets were sorted, and IgG subclass secretion in the presence or absence of IL-4 was compared in B cell assays which lead to Ig secretion in all (coculture with EL-4 thymoma cells) or only in CD27+ (CD40L stimulation) B cells. Already-switched IgG+ B cells showed no significant sequential switch, and IgM-only cells also had a low switch capacity, but IgD+CD27+ B cells switched as much as IgD+CD27- B cells to all IgG subclasses. Thus, in switched B cells alterations compromising further switch options occur frequently; IgM-only cells may result from an aborted switch. However, IgD+CD27+ human B cells, extensively V(D)J-mutated and "naive" with regard to switching, build up a repertoire of B cells combining (1) novel cross-reactive specificities, (2) increased differentiation capacity (including after T-independent stimulation by Staphylococcus aureus Cowan I) and (3) the capacity to produce appropriate isotypes when they respond to novel pathogens.
Abstract:
Automatic environmental monitoring networks supported by wireless communication technologies now provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of these data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topography of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The methodology is illustrated by the mapping of temperature (including situations of Föhn and temperature inversion) from measurements taken by the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
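An indicative sketch of such a data-driven workflow (not the study's implementation) combines feature selection over DEM-derived predictors with support vector and neural-network regressors; the predictor names and data below are placeholders:

```python
# Hypothetical sketch: temperature mapping from DEM-derived features.
# Columns might be x, y, elevation, slope, curvature, ...; here they are synthetic.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(2)
n_stations, n_features = 300, 12
X = rng.normal(size=(n_stations, n_features))                        # placeholder terrain features
t = 10.0 - 6.5 * X[:, 2] + rng.normal(scale=1.0, size=n_stations)    # toy temperature signal

models = {
    "SVR": make_pipeline(StandardScaler(), SelectKBest(f_regression, k=6), SVR(C=10.0)),
    "MLP": make_pipeline(StandardScaler(), SelectKBest(f_regression, k=6),
                         MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)),
}
for name, model in models.items():
    rmse = -cross_val_score(model, X, t, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name}: mean CV RMSE = {rmse.mean():.2f} °C")

# A fitted model could then be evaluated on a dense grid of DEM cells to produce the map.
```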
Abstract:
Type 1 diabetic patients depend on external insulin delivery to keep their blood glucose within near-normal ranges. In this work, two robust closed-loop controllers for blood glucose regulation are developed to prevent life-threatening hypoglycemia as well as to avoid extended hyperglycemia. The proposed controllers are designed using the sliding mode control technique in a Smith predictor structure. To improve meal disturbance rejection, a simple feedforward controller is added to inject the meal-time insulin bolus. Simulation scenarios were used to test the controllers and showed their ability to maintain glucose levels within safe limits in the presence of errors in measurement, modeling and meal estimation.
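The following toy sketch shows only the structural idea of sliding mode control wrapped in a Smith predictor for a delayed plant; a generic first-order model stands in for the patient, and every gain, delay and setpoint value is an illustrative assumption, not the controllers designed in the paper.

```python
# Toy sketch: sliding-mode control inside a Smith predictor for a plant with input delay.
import numpy as np

dt, t_end = 0.1, 60.0
Kp, T, delay = 2.0, 5.0, 2.0              # plant gain, time constant, input delay
d_steps = int(delay / dt)

Ks, phi = 1.5, 0.2                        # switching gain and boundary-layer width
r = 1.0                                   # setpoint (normalised units)

y, ym = 0.0, 0.0                          # plant output and undelayed internal model
u_buf = [0.0] * d_steps                   # inputs still travelling through the delay
ymd_buf = [0.0] * d_steps                 # delayed internal-model outputs

for _ in range(int(t_end / dt)):
    # Smith predictor: undelayed model output plus plant/model mismatch correction
    y_pred = ym + (y - ymd_buf[0])
    s = r - y_pred                        # simplest sliding variable: the predicted error
    u = Ks * np.clip(s / phi, -1.0, 1.0)  # saturated sign(s): sliding mode with boundary layer

    u_delayed = u_buf.pop(0)
    u_buf.append(u)
    y += dt * (-y + Kp * u_delayed) / T   # the real plant sees the delayed input
    ym += dt * (-ym + Kp * u) / T         # the internal model sees the current input
    ymd_buf.pop(0)
    ymd_buf.append(ym)

print(f"output after {t_end:.0f} time units: {y:.3f} (setpoint {r})")
```

With this minimal sliding variable a small steady-state offset remains; an integral term in the surface, or the meal-time feedforward bolus described in the abstract, could be layered on top in a fuller design.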
Abstract:
The main objective of this thesis is to highlight the persistence of family capitalism in Switzerland during the 20th century and its resistance to the managerial and financial capitalisms that are supposed to have succeeded it. For this purpose, we focus on twenty-two big companies of the machine, electrotechnical and metallurgy sector, the main branch of Swiss industry for the period considered, whose boards of directors and executive managers have been identified for five benchmark years covering the century (1910, 1937, 1957, 1980 and 2000). This thesis combines business history and the sociology of elites, and uses different methods such as network analysis and prosopography. It is organized around three main research axes: the first aims to identify the evolution of corporate governance in our twenty-two enterprises, the second concentrates on interfirm coordination, and the third draws a collective portrait of the corporate elite leading our firms. Our results show that during most of the century, the majority of the companies were controlled by families and relied on non-market mechanisms of coordination, notably a dense network of interlocking directorates, while the profile of the corporate elite remained largely stable. Although several changes at the end of the century confirm a transition towards financial or shareholder capitalism and more competitive practices among firms and the industrial elite, the persistence of family control in several companies and the survival of some former mechanisms of cooperation lead us to qualify this conclusion.
Abstract:
The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are discussed, and real case studies based on environmental and pollution data are carried out. The book provides a CD-ROM with the Machine Learning Office software, including sample data sets, that allows both students and researchers to put the concepts rapidly into practice.
Abstract:
This article presents a selection from a corpus drawn from Pantagruel, in which Rabelais's characters attack or flatter other nations and towns, grotesquely and hyperbolically, in the purest Pantagruelian style. For these ethnic epithets, Rabelais reuses the popular blazon (blason populaire), in a complete fusion of praise and insult. In our study we examine whether the sociocentric view that the blazoner applies to the blazoned, as well as the playful intent, have been successfully carried over into the Catalan (Miquel-Àngel Sánchez Férriz, 1985), Spanish (Alicia Yllera, 2003), Italian (Mario Bonfantini, 1953) and English (Burton Raffel, 1991) versions.
Abstract:
In this work we explore the multivariate empirical mode decomposition (mEMD) combined with a neural network classifier as a technique for face recognition tasks. Images are decomposed simultaneously by means of mEMD, and the distance between the modes of an image and the modes of the representative image of each class is calculated using three different distance measures. A neural network is then trained using 10-fold cross-validation in order to derive a classifier. Preliminary results (a classification rate of over 98%) are satisfactory and justify a deeper investigation of how to apply mEMD for face recognition.
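A rough outline of the classification stage described above might look as follows; the mEMD step is stubbed out as a hypothetical memd_modes function (not a real API), only one of the three distance measures (Euclidean) is shown, and all data are placeholders.

```python
# Hypothetical outline: distances between an image's mEMD modes and each class's
# representative modes are used as features for a neural-network classifier.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
n_modes = 4

def memd_modes(image):
    """Placeholder for a multivariate EMD routine (NOT a real API).
    A real implementation would return the intrinsic mode functions of the image."""
    return np.vstack([image * (k + 1) for k in range(n_modes)])

def mode_distance_features(image, class_representatives):
    """Euclidean distances between each mode of the image and of each class representative."""
    modes = memd_modes(image)
    feats = []
    for rep in class_representatives:
        feats.extend(np.linalg.norm(modes - memd_modes(rep), axis=1))
    return np.array(feats)

n_classes, per_class = 5, 20
images = rng.normal(size=(n_classes * per_class, 32 * 32))   # placeholder face vectors
labels = np.repeat(np.arange(n_classes), per_class)          # placeholder identities
reps = rng.normal(size=(n_classes, 32 * 32))                 # placeholder class representatives

X = np.vstack([mode_distance_features(img, reps) for img in images])
clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=1000, random_state=0)
print("10-fold CV accuracy:", cross_val_score(clf, X, labels, cv=10).mean())
```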
Abstract:
Artifacts are present in most electroencephalography (EEG) recordings, making it difficult to interpret or analyze the data. In this paper a cleaning procedure based on a multivariate extension of empirical mode decomposition is used to improve the quality of the data. This is achieved by applying the cleaning method to raw EEG data. A synchrony measure is then applied to the raw and the cleaned data in order to compare the improvement in classification rate. Two classifiers are used, linear discriminant analysis and neural networks. In both cases, the classification rate is improved by about 20%.
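Purely to illustrate the comparison described (the MEMD cleaning and the synchrony measure are assumed to have been computed already; the feature matrices below are synthetic stand-ins), the two classifiers could be evaluated on synchrony features from raw versus cleaned recordings:

```python
# Hypothetical comparison of LDA and a neural network on synchrony features
# computed from raw vs MEMD-cleaned EEG trials (synthetic placeholder features).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n_trials, n_features = 200, 30
y = np.repeat([0, 1], n_trials // 2)                      # two conditions / mental states

# Stand-ins: the "cleaned" features carry more class information than the "raw" ones.
X_raw = rng.normal(size=(n_trials, n_features)) + 0.3 * y[:, None]
X_clean = rng.normal(size=(n_trials, n_features)) + 0.8 * y[:, None]

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "NN": make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)),
}
for name, clf in classifiers.items():
    for label, X in [("raw", X_raw), ("cleaned", X_clean)]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name} on {label} data: accuracy = {acc:.2f}")
```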