469 results for Probable Number Technique
Abstract:
To minimise the number of load sheddings in a microgrid (MG) during autonomous operation, islanded neighbouring MGs can be interconnected if they are on a self-healing network and extra generation capacity is available in the distributed energy resources (DERs) of one of the MGs. In this way, the total load in the system of interconnected MGs can be shared by all the DERs within those MGs. However, this requires carefully designed self-healing and supply restoration control algorithms, protection systems and communication infrastructure at the network and MG levels. In this study, a hierarchical control structure is first discussed for interconnecting neighbouring autonomous MGs, with the introduced primary control level as the main focus. Through the developed primary control level, this study demonstrates how parallel DERs in a system of multiple interconnected autonomous MGs can properly share the system load. The controller is designed such that the converter-interfaced DERs operate in a voltage-controlled mode following a decentralised power sharing algorithm based on droop control. The DER converters are controlled using a per-phase technique instead of the conventional direct-quadrature (dq) transformation technique. In addition, linear quadratic regulator-based state feedback controllers, which are more stable than conventional proportional-integral controllers, are utilised to prevent instability and weak dynamic performance of the DERs when autonomous MGs are interconnected. The efficacy of the primary control level of the DERs in a system of multiple interconnected autonomous MGs is validated through PSCAD/EMTDC simulations considering detailed dynamic models of the DERs and converters.
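The decentralised droop-based sharing described above has a simple steady-state consequence: because every DER settles at the same system frequency, each unit's active-power share is inversely proportional to its droop gain. The sketch below illustrates only that steady-state arithmetic (the gains and load value are hypothetical, not the paper's design values):

```python
# Sketch of decentralised P-f droop sharing among parallel DERs.
# At steady state every DER sees the same frequency:
#     f = f0 - m_i * P_i   for all i
# so P_i is inversely proportional to its droop gain m_i.

def droop_share(total_load, droop_gains):
    """Return each DER's steady-state active-power share."""
    inv = [1.0 / m for m in droop_gains]
    s = sum(inv)
    return [total_load * w / s for w in inv]

# Two DERs: the unit with the smaller droop gain takes the larger share.
shares = droop_share(300.0, [0.01, 0.02])
print(shares)
```

This is why droop gains are typically chosen in inverse proportion to converter ratings: the load then splits in proportion to capacity without any communication between units.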
Abstract:
Course evaluations are now a serious matter for universities seeking to meet stakeholder needs and expectations, assure quality, drive improvements and inform strategic decision making. Typically, students are invited to participate in surveys on how well the design and delivery aspects meet predetermined learning objectives, the quality of teaching, and the types of improvements needed for future deliveries. We used the Most Significant Change technique to gather data on the impact of a leadership course on 18 Pacific Islanders who completed a Master of Education (Educational Leadership). Participants' views highlighted impacts that were of significance to the students and their workplaces. The findings demonstrate that the Most Significant Change technique offers a more comprehensive understanding of the impact of leadership development courses.
Abstract:
This essay argues that the deployment of spatial metaphor in the writing of Michel Foucault is indivisible from his spatial politics. Beginning with his 1967 essay "Of Other Spaces," the development of Foucault's spatial politics and his growing awareness of the importance to his work of spatial (particularly geographic) metaphors can be charted. The focus here is not the concretisation of Foucault's early spatial obsessions—particularly with regard to the concept of "heterotopia"—into a theory or model. Rather, I am concerned with the way in which those obsessions inform Foucault's major works, in particular The Archaeology of Knowledge and Discipline and Punish. These works, I argue, do not develop a theory of space, but instead perform, through their rhetoric, a kind of spatial praxis. In this sense, Foucault's metaphors become "spatial techniques" for the practice and production of power–knowledge.
Abstract:
User evaluations using paper prototypes commonly lack social context. The group simulation technique described in this paper offers a solution to this problem. The study introduces an early-phase participatory design technique targeted at small groups. The proposed technique is used to evaluate an interface that enables group work in photo collection creation. Three groups of four users, 12 in total, took part in a simulation session where they tested a low-fidelity design concept that included their own personal photo content from an event their group attended together. The users' own content was used to evoke natural experiences. Our results indicate that the technique helped users engage naturally with the prototype during the session. We suggest the technique is also suitable for evaluating other early-phase concepts and for guiding design solutions, especially for concepts that include users' personal content and enable content sharing.
Abstract:
Dynamic light scattering (DLS) has become a primary nanoparticle characterization technique, with applications from materials characterization to biological and environmental detection. With the expansion of DLS use from homogeneous spheres to more complicated nanostructures comes a decrease in accuracy. Much research has been performed to develop diffusion models that account for these vastly different structures, but little attention has been given to how structure affects the light scattering properties relevant to DLS. In this work, small (core size < 5 nm) core-shell nanoparticles were used as a case study to measure the thickness of a dodecanethiol (DDT) capping layer on Au and ZnO nanoparticles by DLS. We find that the DDT shell has very little effect on the scattering properties of the inorganic core and hence can be ignored to a first approximation. However, this means that conventional DLS analysis overestimates the hydrodynamic size in the volume- and number-weighted distributions. By introducing a simple correction formula that yields more accurate hydrodynamic size distributions, a more precise determination of the molecular shell thickness is obtained. With this correction, the measured thickness of the DDT shell was found to be 7.3 ± 0.3 Å, much less than the extended chain length of 16 Å. This organic layer thickness suggests that, on small nanoparticles, the DDT monolayer adopts a compact disordered structure rather than an open ordered structure on both ZnO and Au nanoparticle surfaces. These observations are in agreement with published molecular dynamics results.
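DLS fundamentally measures a translational diffusion coefficient and converts it to a hydrodynamic radius via the Stokes-Einstein relation; the paper's correction refines what that radius means for a core-shell particle. A minimal sketch of the standard conversion step (the diffusion coefficient used here is a hypothetical value, not from the paper; the correction formula itself is not reproduced since it is not given in the abstract):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def hydrodynamic_radius(diff_coeff, temp_k=298.15, viscosity=8.9e-4):
    """Stokes-Einstein relation underlying DLS sizing:
        R_h = k_B * T / (6 * pi * eta * D)
    Default viscosity is roughly that of water at 25 C (Pa s).
    """
    return K_B * temp_k / (6 * math.pi * viscosity * diff_coeff)


# Hypothetical diffusion coefficient for a few-nm particle, in m^2/s.
r_h = hydrodynamic_radius(6.0e-11)
print(f"hydrodynamic radius: {r_h * 1e9:.2f} nm")
```

For a core-shell particle, an apparent shell thickness would follow as the difference between the measured R_h and the known core radius, which is the quantity the paper's correction makes more precise.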
Abstract:
The position(s) of carbon-carbon double bonds within lipids can dramatically affect their structure and reactivity and thus has a direct bearing on biological function. Commonly employed mass spectrometric approaches to the characterization of complex lipids, however, fail to localize sites of unsaturation within the molecular structure and thus cannot distinguish naturally occurring regioisomers. In a recent communication [Thomas, M. C.; Mitchell, T. W.; Blanksby, S. J. J. Am. Chem. Soc. 2006, 128, 58-59], we presented a new technique for the elucidation of double bond position in glycerophospholipids using ozone-induced fragmentation within the source of a conventional electrospray ionization mass spectrometer. Here we report the on-line analysis, using ozone electrospray mass spectrometry (OzESI-MS), of a broad range of common unsaturated lipids including acidic and neutral glycerophospholipids, sphingomyelins, and triacylglycerols. All lipids analyzed are found to form a pair of chemically induced fragment ions diagnostic of the position of each double bond, regardless of the polarity, the number of charges, or the adduct (e.g., [M - H]-, [M - 2H]2-, [M + H]+, [M + Na]+, [M + NH4]+). The ability of OzESI-MS to distinguish lipids that differ only in the position of the double bonds is demonstrated using the glycerophosphocholine standards GPCho(9Z-18:1/9Z-18:1) and GPCho(6Z-18:1/6Z-18:1). While these regioisomers cannot be differentiated by their conventional tandem mass spectra, the OzESI-MS spectra reveal abundant fragment ions of distinctive mass-to-charge ratio (m/z). The approach is found to be sufficiently robust to be used in conjunction with the m/z 184 precursor ion scans commonly employed for the identification of phosphocholine-containing lipids in shotgun lipidomic analyses.
This tandem OzESI-MS approach was used, in conjunction with conventional tandem mass spectral analysis, for the structural characterization of an unknown sphingolipid in a crude lipid extract obtained from a human lens. The OzESI-MS data confirm the presence of two regioisomers, namely, SM(d18:0/15Z-24:1) and SM(d18:0/17Z-24:1), and suggest the possible presence of a third isomer, SM(d18:0/19Z-24:1), in lower abundance. The data presented herein demonstrate that OzESI-MS is a broadly applicable, on-line approach for structure determination and, when used in conjunction with established tandem mass spectrometric methods, can provide near complete structural characterization of a range of important lipid classes. As such, OzESI-MS may provide important new insight into the molecular diversity of naturally occurring lipids.
Abstract:
The analysis of content and metadata has long been the subject of most Twitter studies; however, such research tells only part of the story of the development of Twitter as a platform. In this work, we introduce a methodology for determining the growth patterns of individual users of the platform, a technique we refer to as follower accession, and through a number of case studies consider the factors that lead to follower growth and the identification of non-authentic followers. Finally, we consider what such an approach tells us about the history of the platform itself, and the way in which changes to the new-user signup process have impacted users.
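One common way to reconstruct a follower growth curve of this kind rests on the observation that follower lists have historically been returned newest-first, so reversing the list recovers the order in which followers were acquired, and each follower's account-creation date gives a lower bound on when that accession rank was reached. The sketch below is an illustrative reading of that idea, not the paper's implementation; the field layout and dates are hypothetical:

```python
from datetime import date


def follower_accession(followers_newest_first):
    """Sketch of a follower-accession curve.

    Input: (user_id, account_created) pairs as returned newest-first.
    Reversing gives accession order; the running maximum of account
    creation dates is a lower bound on when each rank was reached.
    """
    oldest_first = list(reversed(followers_newest_first))
    curve = []
    latest = date.min
    for rank, (_user_id, created) in enumerate(oldest_first, start=1):
        latest = max(latest, created)  # earliest possible accession time
        curve.append((rank, latest))
    return curve


# Hypothetical follower list, newest follower first.
followers = [(104, date(2013, 6, 1)),
             (57, date(2010, 2, 3)),
             (9, date(2009, 5, 20))]
print(follower_accession(followers))
```

Plotting rank against these bounded dates yields the growth curve; sudden jumps in rank over short date spans are the kind of signal one might inspect for non-authentic followers.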
Abstract:
Effective machine fault prognostic technologies can eliminate unscheduled downtime, increase machine useful life and consequently reduce maintenance costs, as well as prevent human casualties in real engineering asset management. This paper presents a technique for accurate assessment of the remnant life of machines based on health state probability estimation and historical failure knowledge embedded in a closed-loop diagnostic and prognostic system. To estimate a discrete machine degradation state that can effectively represent the complex nature of machine degradation, the proposed prognostic model employs a classification algorithm, which can use a number of damage-sensitive features, unlike conventional time series analysis techniques, for accurate long-term prediction. To validate the feasibility of the proposed model, data at five severity levels for four typical faults from High Pressure Liquefied Natural Gas (HP-LNG) pumps were used to compare intelligent diagnostic tests using five different classification algorithms. In addition, two sets of impeller-rub data were analysed and employed to predict the remnant life of the pump based on health state probability estimation using the Support Vector Machine (SVM) classifier. The results obtained were very encouraging and showed that the proposed prognostic system has the potential to be used as a tool for machine remnant life prediction in real-life industrial applications.
Abstract:
In this study, a machine learning technique called anomaly detection is employed for wind turbine bearing fault detection. The anomaly detection algorithm is used to recognise the presence of unusual and potentially faulty data in a dataset, and comprises two phases: a training phase and a testing phase. Two bearing datasets were used to validate the proposed technique: fault-seeded bearing data from a test rig located at Case Western Reserve University, used to verify the accuracy of the anomaly detection method, and run-to-failure bearing data from the NSF I/UCR Center for Intelligent Maintenance Systems (IMS). The latter dataset was used to compare anomaly detection with SVM, a previously well-known applied method, in rapidly finding incipient faults.
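The two-phase structure mentioned above can be made concrete with the simplest possible anomaly detector: learn the statistics of healthy data in the training phase, then flag test samples that fall far outside that band. This is only a generic illustration of the train/test split, not the paper's algorithm; the vibration values and threshold are hypothetical:

```python
import statistics


def train(normal_data):
    """Training phase: model 'normal' behaviour from healthy-bearing data."""
    return statistics.mean(normal_data), statistics.stdev(normal_data)


def is_anomaly(x, model, threshold=3.0):
    """Testing phase: flag samples more than `threshold` standard
    deviations from the learned normal mean."""
    mu, sigma = model
    return abs(x - mu) / sigma > threshold


# Hypothetical vibration RMS readings from a healthy bearing.
model = train([0.50, 0.52, 0.48, 0.51, 0.49, 0.50, 0.53, 0.47])
print(is_anomaly(0.51, model))  # reading inside the normal band
print(is_anomaly(1.20, model))  # incipient-fault-like reading
```

The appeal for incipient-fault detection is that only healthy data are needed for training, whereas a supervised classifier such as an SVM also needs labelled faulty examples.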
Abstract:
As the boundaries between public and private, human and technology, digital and social, mediated and natural, online and offline become increasingly blurred in modern techno-social hybrid societies, sociology as a discipline needs to adapt and adopt new ways of accounting for these digital cultures. In this paper I use the social networking site Pinterest to demonstrate how people today are shaped by, and in turn shape, the digital tools they are assembled with. Digital sociology is emerging as a sociological subdiscipline that engages with the convergence of the digital and the social. However, there seems to be a focus on developing new methods for studying digital social life, yet a neglect of concrete explorations of its culture. I argue for the need for critical socio-cultural ‘thick description’ to account for the interrelations between humans and technologies in modern digitally mediated cultures.
Abstract:
Similarity solutions are derived for the flow of a power-law non-Newtonian fluid film on an unsteady stretching surface subjected to constant heat flux. Free convection heat transfer induces a thermal boundary layer within a semi-infinite layer of Boussinesq fluid. The nonlinear coupled partial differential equations (PDEs) governing the flow, together with the boundary conditions, are converted to a system of ordinary differential equations (ODEs) using two-parameter groups. This technique reduces the number of independent variables by two, and the resulting ordinary differential equations are solved numerically for the temperature and velocity using the shooting method. The thermal and velocity boundary layers are studied by means of the Prandtl number and the non-Newtonian power-law index, with the results plotted as curves.
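The shooting method mentioned above converts a boundary-value problem into repeated initial-value problems: guess the unknown initial slope, integrate the ODE, and adjust the guess until the far boundary condition is met. The sketch below applies the idea to a toy problem, y'' = 6x with y(0) = 0 and y(1) = 1 (exact solution y = x^3), standing in for the paper's similarity ODEs:

```python
def f(x, y, v):
    # First-order system for y'' = 6x: y' = v, v' = 6x.
    return v, 6.0 * x


def shoot(s, n=1000):
    """Integrate from x=0 with y(0)=0, y'(0)=s using RK4; return y(1)."""
    h = 1.0 / n
    x, y, v = 0.0, 0.0, s
    for _ in range(n):
        k1 = f(x, y, v)
        k2 = f(x + h / 2, y + h / 2 * k1[0], v + h / 2 * k1[1])
        k3 = f(x + h / 2, y + h / 2 * k2[0], v + h / 2 * k2[1])
        k4 = f(x + h, y + h * k3[0], v + h * k3[1])
        y += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        v += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        x += h
    return y


def bisect_slope(target=1.0, lo=-5.0, hi=5.0, tol=1e-10):
    """Adjust the unknown initial slope by bisection until y(1) = target."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if shoot(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2


slope = bisect_slope()
print(f"required initial slope: {slope:.6g}")  # near 0 for y = x^3
```

In the paper's setting the same loop runs over the coupled velocity and temperature ODEs, with the missing wall gradients as the shooting parameters.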
Abstract:
Unbalanced or non-linear loads result in distorted stator currents and electromagnetic torque pulsations in stand-alone doubly fed induction generators (DFIGs). This study proposes a proportional-integral repetitive control (PIRC) scheme to mitigate the levels of harmonics and unbalance at the stator terminals of the DFIG. The PIRC is structurally simpler and requires much less computation than existing methods. An analysis of PIRC operation and a methodology for determining the control parameters are included. Simulation studies as well as laboratory test measurements clearly demonstrate the effectiveness of the proposed PIRC scheme.
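The defining feature of a repetitive controller is a memory of one fundamental period: the correction applied at each point of the period is built up cycle after cycle from the error seen one period earlier, which is what lets it cancel periodic harmonics and unbalance. A structural sketch of a discrete PI + repetitive control law follows (the gains and period length are hypothetical, not the paper's design values):

```python
class PIRC:
    """Discrete proportional-integral + repetitive control law (structure
    only):  u[k] = Kp*e[k] + Ki*sum(e) + u_rc[k],  where
            u_rc[k] = u_rc[k-N] + Kr*e[k-N],  N = samples per period."""

    def __init__(self, kp, ki, kr, period):
        self.kp, self.ki, self.kr, self.n = kp, ki, kr, period
        self.integral = 0.0
        self.e_buf = [0.0] * period  # e[k-N], one slot per period sample
        self.u_buf = [0.0] * period  # u_rc[k-N]
        self.k = 0

    def update(self, error):
        i = self.k % self.n
        u_rc = self.u_buf[i] + self.kr * self.e_buf[i]
        self.u_buf[i] = u_rc
        self.e_buf[i] = error
        self.integral += error
        self.k += 1
        return self.kp * error + self.ki * self.integral + u_rc


# Feed a repeating (periodic) error pattern: the repetitive term builds
# up the correction at each point of the period, cycle after cycle.
ctrl = PIRC(kp=1.0, ki=0.0, kr=0.5, period=4)
pattern = [1.0, 0.0, -1.0, 0.0]
outs = [ctrl.update(e) for e in pattern * 3]
print(outs[0], outs[4], outs[8])  # output at the same period point grows
```

In closed loop this growing per-sample correction is what drives the periodic component of the stator error toward zero, while the PI part handles the DC behaviour.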
Abstract:
The top-k retrieval problem aims to find the optimal set of k documents from a number of relevant documents given the user’s query. The key issue is to balance the relevance and diversity of the top-k search results. In this paper, we address this problem using Facility Location Analysis taken from Operations Research, where the locations of facilities are optimally chosen according to some criteria. We show how this analysis technique is a generalization of state-of-the-art retrieval models for diversification (such as the Modern Portfolio Theory for Information Retrieval), which treat the top-k search results like “obnoxious facilities” that should be dispersed as far as possible from each other. However, Facility Location Analysis suggests that the top-k search results could be treated like “desirable facilities” to be placed as close as possible to their customers. This leads to a new top-k retrieval model where the best representatives of the relevant documents are selected. In a series of experiments conducted on two TREC diversity collections, we show that significant improvements can be made over the current state-of-the-art through this alternative treatment of the top-k retrieval problem.
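The "desirable facilities" view has a natural greedy formulation: select the k documents that maximise a facility-location objective, where each relevant document ("customer") is served by its most similar selected document. The sketch below is an illustrative instance of that objective under a toy similarity matrix, not the paper's exact model:

```python
def greedy_topk(sim, relevance, k):
    """Greedy maximisation of a facility-location objective: selected
    documents act as 'desirable facilities' serving the remaining
    relevant documents (the 'customers').

    sim[i][j]    -- similarity between documents i and j
    relevance[i] -- query relevance weight of document i
    """
    n = len(relevance)
    selected = []

    def coverage(sel):
        # Each document is served by its closest selected facility.
        return sum(relevance[j] * max(sim[j][i] for i in sel)
                   for j in range(n))

    for _ in range(k):
        best = max((i for i in range(n) if i not in selected),
                   key=lambda i: coverage(selected + [i]))
        selected.append(best)
    return selected


# Toy corpus: docs 0 and 1 are near-duplicates; doc 2 covers a different
# aspect, so it is chosen over the duplicate despite lower relevance.
sim = [[1.0, 0.9, 0.1],
       [0.9, 1.0, 0.1],
       [0.1, 0.1, 1.0]]
rel = [1.0, 0.9, 0.8]
print(greedy_topk(sim, rel, 2))  # → [0, 2]
```

Because the coverage objective is submodular, this greedy selection carries the usual (1 - 1/e) approximation guarantee, which is one reason facility-location formulations are attractive for diversified retrieval.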
Abstract:
Data associated with germplasm collections are typically large and multivariate, with a considerable number of descriptors measured on each of many accessions. The pattern analysis methods of clustering and ordination have been identified as techniques for statistically evaluating the available diversity in germplasm data. While used in many studies, these approaches have not dealt explicitly with the computational consequences of large data sets (i.e. more than 5000 accessions). To consider the application of these techniques to germplasm evaluation data, 11,328 accessions of groundnut (Arachis hypogaea L.) from the International Crops Research Institute for the Semi-Arid Tropics, Andhra Pradesh, India were examined. Data for nine quantitative descriptors measured in the rainy and post-rainy growing seasons were used. The ordination technique of principal component analysis was used to reduce the dimensionality of the germplasm data. Identifying phenotypically similar groups of accessions within such large-scale data via computationally intensive hierarchical clustering techniques was not feasible, so non-hierarchical techniques had to be used. Finite mixture models that maximise the likelihood of an accession belonging to a cluster were used to cluster the accessions in this collection. The patterns of response for the different growing seasons were found to be highly correlated. However, when the results were related to passport and other characterisation and evaluation descriptors, the observed patterns did not appear to be related to taxonomy or any other well-known characteristics of groundnut.
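The scaling argument above is easy to see in code: hierarchical clustering needs the full pairwise-distance matrix (quadratic in the number of accessions), whereas a non-hierarchical pass touches each accession only once per iteration. The paper fits finite mixture models; the sketch below uses k-means, a simpler non-hierarchical relative, purely to illustrate the scaling point, with a hypothetical two-descriptor toy data set:

```python
def kmeans(points, k, iters=20):
    """Plain k-means: per-iteration cost is linear in the number of
    accessions, unlike the O(n^2) distance matrix that hierarchical
    clustering requires."""
    centroids = list(points[:k])  # naive seeding, enough for a sketch
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            groups[j].append(p)
        centroids = [tuple(sum(xs) / len(xs) for xs in zip(*g)) if g
                     else centroids[j] for j, g in enumerate(groups)]
    return centroids, groups


# Two well-separated toy 'accession' clusters in a 2-descriptor space.
pts = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
       (5.0, 5.0), (5.1, 4.9), (4.9, 5.1)]
centroids, groups = kmeans(pts, 2)
print(centroids)
```

A finite mixture model replaces the hard nearest-centroid assignment with per-cluster membership probabilities, but shares this same linear-per-pass structure, which is what made it feasible for the 11,328-accession collection.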