895 results for Hotel


Relevance:

10.00%

Publisher:

Abstract:

A switching control strategy is proposed for a single-inductor current-fed push-pull converter with a secondary-side active voltage-doubler rectifier or a voltage rectifier, used in photovoltaic (PV) grid interfacing. The proposed strategy allows the primary-side power switches to be turned on and off under zero-voltage and zero-current switching conditions. The converter is analyzed in two modes of operation, and the feasibility of the proposed switching control strategy is validated through simulation and experimental results.
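
The abstract does not detail the gating pattern, but the defining constraint of a current-fed push-pull stage is that the input inductor current must always have a conduction path, so the two primary-side gate signals overlap. The sketch below is a hypothetical illustration of that overlapping gate timing; the switching frequency, duty ratio and sample count are assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical gate-timing sketch for the two primary switches (S1, S2) of a
# current-fed push-pull converter. The duty ratio is kept above 0.5 so that the
# gate signals overlap and the input inductor current always has a path.
F_SW = 50e3   # switching frequency in Hz (assumed)
DUTY = 0.6    # per-switch duty ratio, > 0.5 to guarantee overlap (assumed)
N = 1000      # samples per switching period

t = np.linspace(0.0, 1.0 / F_SW, N, endpoint=False)
phase = (t * F_SW) % 1.0                  # normalised position within the period

s1 = phase < DUTY                         # S1 gate signal
s2 = ((phase + 0.5) % 1.0) < DUTY         # S2 gate signal, shifted by half a period

overlap = s1 & s2                         # both switches on: inductor current freewheels
assert overlap.any(), "duty ratio must exceed 0.5 in a current-fed push-pull stage"
print(f"overlap interval: {overlap.mean() * 100:.1f}% of the switching cycle")
```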

Relevance:

10.00%

Publisher:

Abstract:

The impetus for the study reported in this paper is the Higher Education (HE) reform agenda outlined by the Vietnamese Ministry of Education and Training (MOET). The paper reports on phase one of a mixed-methods study: a quantitative approach using the Multifactor Leadership Questionnaire (MLQ) to investigate Vietnamese HE leaders’ leadership styles. The MLQ survey was administered to approximately 190 senior managers in state HE institutions (nine colleges) in the Mekong Delta region of Vietnam. The psychometrics of the MLQ for the Vietnamese sample confirmed the reliability and validity of the instrument, with a Cronbach’s alpha of 0.779. A confirmatory factor analysis (CFA) was conducted, and all factor structures were stable and consistent. Demographic variables were used to analyse patterns of leadership behaviours across sub-groups. The findings suggest that leaders in the Mekong Delta region of different educational backgrounds and genders do not differ significantly in their perceptions of leadership factors.
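
The abstract reports a Cronbach’s alpha of 0.779 for the MLQ sample. As a minimal, hedged illustration of how that reliability coefficient is computed, the sketch below applies the standard alpha formula to a synthetic Likert-style response matrix; the data are made up and are not the study’s MLQ responses.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Synthetic 5-point responses from ~190 respondents (illustrative only).
rng = np.random.default_rng(0)
latent = rng.normal(size=(190, 1))
responses = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(190, 9))), 1, 5)
print(f"alpha = {cronbach_alpha(responses):.3f}")
```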

Relevance:

10.00%

Publisher:

Abstract:

Despite significant improvements in capacity-distortion performance, computationally efficient capacity control is still lacking in recent watermarking schemes. In this paper, we propose an efficient capacity control framework that casts watermarking capacity control as the process of maintaining “acceptable” distortion and running time while attaining the required capacity. Analysis and experimental results on capacity control are reported to address practical aspects of the watermarking capacity problem in dynamic-size payload embedding.
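
The abstract frames capacity control as keeping distortion and running time acceptable while reaching the required capacity, but does not give the algorithm. The sketch below is a hypothetical control loop of that kind, using PSNR as the distortion measure and a placeholder embedder; it illustrates the idea only and is not the authors’ scheme.

```python
import numpy as np

def psnr(original: np.ndarray, marked: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((original.astype(float) - marked.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

def embed(image: np.ndarray, payload_bits: int, strength: float) -> np.ndarray:
    """Placeholder embedder standing in for a real watermarking scheme."""
    rng = np.random.default_rng(payload_bits)
    return np.clip(image + strength * rng.choice([-1.0, 1.0], size=image.shape), 0, 255)

def capacity_control(image, required_bits, min_psnr=38.0, strength=4.0):
    """Keep the required capacity while backing off strength until distortion is acceptable."""
    marked = embed(image, required_bits, strength)
    while psnr(image, marked) < min_psnr and strength > 0.25:
        strength *= 0.5                      # reduce distortion; capacity stays fixed
        marked = embed(image, required_bits, strength)
    return marked, strength

image = np.random.default_rng(1).integers(0, 256, size=(64, 64)).astype(float)
marked, final_strength = capacity_control(image, required_bits=1024)
print(f"final strength {final_strength}, PSNR {psnr(image, marked):.1f} dB")
```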

Relevance:

10.00%

Publisher:

Abstract:

Peggy Shaw’s RUFF (USA, 2013) and Queensland Theatre Company’s collaboration with Queensland University of Technology, Total Dik! (Australia, 2013), overtly and evocatively draw on an aestheticized use of the cinematic techniques and technologies of chroma key to reveal the tensions in their production and to add layers to their performances. In doing so they offer invaluable insight into where filmic and theatrical approaches overlap. This paper draws on Eckersall, Grehan and Scheer’s New Media Dramaturgy (2014) to reposition the frame as a contribution to intermedial theatre and performance practices, in light of the increasing convergence between seemingly disparate discourses. In RUFF, the scenic environment replicates a chroma-key ‘studio’ that facilitates the reconstruction of memory displaced after a stroke. RUFF uses the screen and projections to recall crooners, lounge singers, movie stars, rock and roll bands, and an eclectic line of eccentric family members living inside Shaw. While the show pays tribute to those who have kept her company across decades of theatrical performance, the use of a non-composited chroma-key technique as a theatrical device, together with the work’s taciturn revelation of the production process during performance, plays a central role in its exploration of the juxtaposition between its reconstructed form and its content. In contrast, Total Dik! uses real-time green-screen compositing during performance as a scenic device. Actors manipulate scale models, refocus cameras and generate scenes within scenes in the construction of the work’s examination of an isolated Dictator. The ‘studio’ is again replicated as a site for (re)construction, only in this case Total Dik! actively seeks to reveal the process of production as the performance plays out. Building on RUFF, and on other works such as By the Way, Meet Vera Stark (2012) and Hotel Modern’s God’s Beard (2012), this work blends mobile technologies, scale models and green-screen capture to explore aspects of transmedia storytelling in a theatrical environment (Jenkins, 2009, 2013). When a green screen is placed on stage, it reads at once as a metaphor and as a challenge to the language of theatre. It becomes, or rather acts as, a ‘sign’ that alludes to the nature of the reconstructed, recomposited, manipulated and controlled. In RUFF and in Total Dik! it is also, as a mode of production and subsequent reveal, a device that adds weight to the performance. These works are informed by Auslander (1999) and Giesekam (2007), and speak to and echo Lehmann’s Postdramatic Theatre (2006). This paper’s consideration of the integration of studio technique and live performance as a dynamic approach to multi-layered theatrical production develops our understanding of their combined use in a live performance environment.

Relevance:

10.00%

Publisher:

Abstract:

Traditional text classification technology based on machine learning and data mining techniques has made significant progress. However, it remains difficult to draw an exact decision boundary between relevant and irrelevant objects in binary classification because of the uncertainty produced by traditional algorithms. The proposed model, CTTC (Centroid Training for Text Classification), builds an uncertainty boundary to absorb as many indeterminate objects as possible, so as to raise the certainty of the relevant and irrelevant groups through a centroid clustering and training process. The clustering starts from the two training subsets labelled as relevant or irrelevant respectively, creating two principal centroid vectors by which all training samples are further separated into three groups: POS, NEG and BND, with all indeterminate objects absorbed into the uncertain decision boundary BND. Two pairs of centroid vectors are then trained and optimized through a subsequent iterative multi-learning process, and together they are used to predict the polarity of incoming objects. For the assessment of the proposed model, F1 and Accuracy were chosen as the key evaluation measures. We stress the F1 measure because it reflects the overall performance improvement of the final classifier better than Accuracy. A large number of experiments were carried out with the proposed model on the Reuters Corpus Volume 1 (RCV1), an important standard dataset in the field. The results show that the proposed model significantly improves binary text classification performance in both F1 and Accuracy compared with three other influential baseline models.
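
As a hedged sketch of the three-way split the abstract describes, the code below computes the two principal centroid vectors from the relevant and irrelevant training subsets and assigns documents to POS, NEG or the uncertainty boundary BND by their cosine-similarity gap. The margin threshold and the single similarity gap are simplifying assumptions; the paper’s iterative multi-learning refinement is not reproduced here.

```python
import numpy as np

def l2_normalize(X: np.ndarray) -> np.ndarray:
    return X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)

def three_way_split(relevant, irrelevant, docs, margin=0.05):
    """Assign docs to POS, NEG or the uncertain boundary BND by centroid similarity.

    `relevant`, `irrelevant` and `docs` are (n, d) term-weight matrices;
    `margin` is an assumed similarity gap below which a document is uncertain.
    """
    c_pos = l2_normalize(relevant.mean(axis=0, keepdims=True))    # relevant centroid
    c_neg = l2_normalize(irrelevant.mean(axis=0, keepdims=True))  # irrelevant centroid
    docs_n = l2_normalize(docs)

    gap = (docs_n @ c_pos.T - docs_n @ c_neg.T).ravel()           # cosine-similarity gap
    return np.where(gap > margin, "POS", np.where(gap < -margin, "NEG", "BND"))

# Toy example with random term-weight vectors (illustrative only).
rng = np.random.default_rng(3)
rel, irr, docs = rng.random((20, 50)), rng.random((20, 50)), rng.random((10, 50))
print(three_way_split(rel, irr, docs))
```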

Relevance:

10.00%

Publisher:

Abstract:

Inappropriate food or medication texture is the most significant risk factor for pneumonia in patients with dysphagia. Dysphagia is prevalent in care homes for older people, as it largely accompanies conditions associated with ageing. This study was designed to determine the appropriateness of medication formulation choices for elderly patients with dysphagia in care homes.

Relevance:

10.00%

Publisher:

Abstract:

Proposed in this paper is a low-cost, half-duplex optical communication bus for control-signal isolation in modular or multilevel power electronic converters. The concept is inspired by the Local Interconnect Network (LIN) serial protocol used in the automotive industry. The proposed bus uses readily available optical transceivers and is suitable for use with low-cost microcontrollers for distributed control of multilevel converters. As a signal isolation concept, the optical bus enables very high cell-count modular multilevel cascaded converters (MMCCs) for high-bandwidth, high-voltage and high-power applications. Prototype hardware is developed, and the optical bus concept is validated experimentally in a 33-level MMCC operating at 120 Vrms and 60 Hz.
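
The abstract says the bus is inspired by the LIN protocol but does not specify the frame format actually carried on the optical link. As an assumption-laden illustration, the sketch below builds a generic LIN-style frame (sync byte, protected identifier with parity bits, data bytes, classic checksum); the ‘cell duty-cycle’ command in the example is hypothetical.

```python
def lin_pid(frame_id: int) -> int:
    """Protected identifier: 6-bit frame ID plus the two LIN parity bits."""
    i = [(frame_id >> b) & 1 for b in range(6)]
    p0 = i[0] ^ i[1] ^ i[2] ^ i[4]
    p1 = (i[1] ^ i[3] ^ i[4] ^ i[5]) ^ 1
    return frame_id | (p0 << 6) | (p1 << 7)

def lin_classic_checksum(data: bytes) -> int:
    """Classic LIN checksum: inverted 8-bit sum with carry wrap-around."""
    s = 0
    for b in data:
        s += b
        if s > 0xFF:
            s -= 0xFF
    return (~s) & 0xFF

def build_frame(frame_id: int, data: bytes) -> bytes:
    """Sync byte + protected ID + data + checksum (break field omitted)."""
    return bytes([0x55, lin_pid(frame_id)]) + data + bytes([lin_classic_checksum(data)])

# Example: a hypothetical 'set cell duty cycle' command addressed to cell 7.
print(build_frame(0x07, bytes([0x32, 0x00])).hex())
```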

Relevance:

10.00%

Publisher:

Abstract:

In the experience economy, the role of art museums has evolved to cater to global cultural tourists. These institutions were traditionally dedicated to didactic functions and served cognoscenti with elite cultural tastes aligned with the avant-garde’s autonomous stance towards mass culture. In a post-avant-garde era, however, museums have focused on appealing to a broad clientele that often has little or no knowledge of historical or contemporary art. Many of these tourists want art to provide entertaining and novel experiences rather than pedagogical ‘training’. In response, art museums are turning into ‘experience venues’, informed by ideas associated with new museology as well as business approaches such as Customer Experience Management. This has led to the provision of populist entertainment modes, such as blockbuster exhibitions, participatory art events, jazz nights and wine tastings, and reveals that such museums recognize that today’s cultural tourist belongs to an increasingly diverse and populous demographic that spans many languages and value systems. As art museums have shifted attention to global tourists, they have come to play a greater role in gentrification projects and cultural precincts. The art museum now seems ideally suited to tourist-centric environments that offer a variety of immersive sensory experiences and combine museums (often designed by star architects), international hotels, restaurants, high-end shopping zones and other leisure forums. Examples include the Porto Maravilha urban waterfront development in Rio de Janeiro, the Museum of Old and New Art in Hobart, and the Chateau La Coste winery and hotel complex in Provence. It can be argued that in a global experience economy, art museums have become experience centres within experience-scapes. This paper examines the nature of the tourist experience in relation to the new art museum, and the latter’s increasingly important role in attracting tourists to urban and regional cultural precincts.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an uncertainty quantification study of the performance of the high-pressure-ratio, single-stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multi-purpose Small Power Unit. A deterministic 3D volume-averaged Computational Fluid Dynamics (CFD) solver is coupled with a non-statistical generalized Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. One advantage of this approach is that it requires no modification of the CFD code to propagate random disturbances in the aerodynamic and geometric fields. The stochastic results highlight the importance of the blade thickness and trailing-edge tip radius for the total-to-static efficiency of the turbine, compared with the angular velocity and trailing-edge tip length. From a theoretical point of view, the use of the gPC representation on an arbitrary grid also allows investigation of the sensitivity of the turbine efficiency to the blade thickness profiles. The gPC approach is also applied to coupled random parameters, and the results show that the most influential pair of random variables is the trailing-edge tip radius coupled with the angular velocity.
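
As a hedged, one-dimensional illustration of a non-intrusive pseudo-spectral gPC projection, the sketch below evaluates a black-box model at Gauss-Legendre nodes and projects the responses onto Legendre polynomials for a uniform random input. The toy efficiency function stands in for the CFD solver; it is not the turbine model or the multi-variable expansion used in the paper.

```python
import numpy as np
from numpy.polynomial.legendre import Legendre, leggauss

def model(xi):
    """Toy stand-in for the CFD response (e.g. efficiency) to a normalised input in [-1, 1]."""
    return 0.85 - 0.02 * xi + 0.005 * xi ** 2

P = 4                                  # polynomial order of the expansion
nodes, weights = leggauss(P + 1)       # Gauss-Legendre quadrature on [-1, 1]
evals = model(nodes)                   # "CFD runs" at the quadrature nodes

# Pseudo-spectral projection: c_k = <u, P_k> / <P_k, P_k> for a uniform input.
coeffs = [np.sum(weights * evals * Legendre.basis(k)(nodes)) / (2.0 / (2 * k + 1))
          for k in range(P + 1)]

mean = coeffs[0]                                                        # gPC mean
var = sum(c ** 2 / (2 * k + 1) for k, c in enumerate(coeffs) if k > 0)  # gPC variance
print(f"mean = {mean:.4f}, std = {np.sqrt(var):.4f}")
```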

Relevance:

10.00%

Publisher:

Abstract:

In structural brain MRI, group differences or changes in brain structures can be detected using Tensor-Based Morphometry (TBM). This method consists of two steps: (1) a non-linear registration step that aligns all of the images to a common template, and (2) a subsequent statistical analysis. The numerous registration methods developed recently differ in their detection sensitivity when used for TBM, and detection power is paramount in epidemiological studies and drug trials. We therefore developed a new fluid registration method that computes the mappings and performs statistics on them in a consistent way, providing a bridge between TBM registration and statistics. We used the Log-Euclidean framework to define a new regularizer that is a fluid extension of the Riemannian elasticity and assures diffeomorphic transformations. This regularizer constrains the symmetrized Jacobian matrix, also called the deformation tensor. We applied our method to an MRI dataset of 40 fraternal and identical twins, revealing voxelwise measures of average volumetric differences in brain structure between subjects with different degrees of genetic resemblance.
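
As a hedged illustration of the quantities such a regularizer acts on, the sketch below computes the Jacobian of a displacement field, forms the pointwise Cauchy-Green deformation tensor J^T J, and sums a Log-Euclidean matrix-logarithm penalty over voxels. This is a simplified conceptual stand-in, not the paper’s fluid registration solver or its exact Riemannian-elasticity energy.

```python
import numpy as np
from scipy.linalg import logm

def jacobian(displacement, spacing=1.0):
    """Spatial Jacobian of phi(x) = x + u(x) on a 3-D grid; `displacement` is (3, X, Y, Z)."""
    grads = np.stack([np.stack(np.gradient(displacement[c], spacing), axis=0)
                      for c in range(3)], axis=0)          # (3, 3, X, Y, Z): du_c/dx_d
    identity = np.eye(3).reshape(3, 3, 1, 1, 1)
    return identity + grads                                # J = I + grad(u)

def log_euclidean_penalty(displacement):
    """Sum over voxels of ||log(J^T J)||_F^2 -- a simplified Log-Euclidean
    elasticity-style penalty on the deformation tensor (illustrative only)."""
    J = jacobian(displacement)
    X, Y, Z = displacement.shape[1:]
    total = 0.0
    for ix, iy, iz in np.ndindex(X, Y, Z):
        Jv = J[:, :, ix, iy, iz]
        C = Jv.T @ Jv                                      # right Cauchy-Green (deformation) tensor
        total += np.linalg.norm(logm(C), "fro") ** 2
    return total

# Tiny synthetic example: a small random displacement field on an 8^3 grid.
rng = np.random.default_rng(1)
u = 0.05 * rng.standard_normal((3, 8, 8, 8))
print(f"penalty = {log_euclidean_penalty(u):.4f}")
```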

Relevance:

10.00%

Publisher:

Abstract:

Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to detect effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N = 2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses, as well as a mixture of the two: combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
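
As a minimal illustration of the meta-analytic side of this comparison, the sketch below combines site-level heritability estimates with fixed-effect inverse-variance weights; in a mega-analysis the raw FA data would instead be pooled and modelled jointly. The site estimates and standard errors shown are invented, not the ENIGMA-DTI values.

```python
import numpy as np

def inverse_variance_meta(estimates, std_errors):
    """Fixed-effect inverse-variance-weighted meta-analytic estimate and its standard error."""
    w = 1.0 / np.asarray(std_errors) ** 2
    est = np.sum(w * np.asarray(estimates)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, se

# Hypothetical site-level FA heritability estimates (h^2) and standard errors.
h2 = [0.62, 0.55, 0.70, 0.58, 0.66]
se = [0.08, 0.10, 0.07, 0.12, 0.09]

pooled, pooled_se = inverse_variance_meta(h2, se)
print(f"meta-analytic h^2 = {pooled:.2f} +/- {pooled_se:.2f}")
```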

Relevance:

10.00%

Publisher:

Abstract:

This is a methodological paper describing when and how manifest items dropped from a latent construct measurement model (e.g., in factor analysis) can be retained for additional analysis. Protocols are presented for assessing items for retention in the measurement model, evaluating dropped items as potential variables separate from the latent construct, and conducting post hoc analyses with all retained (manifest or latent) variables. The protocols are then applied to data on the impact of the NAPLAN test; the variables examined are teachers’ achievement goal orientations and teachers’ perceptions of the impact of the test on curriculum and pedagogy. It is suggested that five attributes be considered before retaining dropped manifest items for additional analyses. (1) Items can be retained when employed in service of an established or hypothesized theoretical model. (2) Items should be retained only if sufficient variance is present in the data set. (3) Items can be retained when they provide a rational segregation of the data set into subsamples (e.g., a consensus measure). (4) The value of retaining items can be assessed using latent class analysis or latent mean analysis. (5) Items should be retained only when post hoc analyses with these items produce significant and substantive results. These exploratory strategies are presented so that other researchers using survey instruments might explore their data in similar and more innovative ways. Finally, suggestions for future use are provided.
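
As a hedged sketch of two of the listed checks, the code below tests whether a dropped item shows sufficient variance (attribute 2) and then fits a simple two-class latent mixture to it, one basic way of approaching attribute 4. The variance threshold, the synthetic Likert responses and the use of a Gaussian mixture as a stand-in for latent class analysis are all assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def sufficient_variance(item: np.ndarray, min_var: float = 0.5) -> bool:
    """Check (2): retain a dropped item only if it shows enough variance (threshold assumed)."""
    return item.var(ddof=1) >= min_var

def latent_class_split(item: np.ndarray):
    """Check (4): a simple two-class latent mixture fitted to a single dropped item."""
    gm = GaussianMixture(n_components=2, random_state=0).fit(item.reshape(-1, 1))
    return gm.predict(item.reshape(-1, 1)), gm.means_.ravel()

# Synthetic 5-point Likert responses to a hypothetical dropped item.
rng = np.random.default_rng(2)
item = np.concatenate([rng.integers(1, 3, 80), rng.integers(4, 6, 120)]).astype(float)

if sufficient_variance(item):
    classes, means = latent_class_split(item)
    print(f"class means: {np.round(means, 2)}, class sizes: {np.bincount(classes)}")
```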

Relevance:

10.00%

Publisher:

Abstract:

Since 2008, all Australian school students have sat standardised tests in Reading, Writing, Language Conventions (Spelling, Grammar and Punctuation) and Numeracy in Years 3, 5, 7 and 9. NAPLAN tests report individual students' attainment of skills against a set of standards, and individual student results are communicated to parents. Schools are then ranked against other schools according to the aggregate of their NAPLAN results. The process is explained to parents and community members as “improving the learning outcomes for all Australian students” (MCEETYA, 2009). This paper examines NAPLAN as it is played out in a mediated space, by analysing unsolicited comment found in new media such as Twitter and online forums. NAPLAN intersects with contemporary debates about Australian education policy: the roles schools should play in improving national productivity, the relationship between state and federal government interests in education, the role and expectations of the teacher, what curriculum and pedagogy should be and look like, and how limited financial resources can best be spread across education sectors and systems. These are not new considerations; what has changed is that education policy seems to have become even more of a political issue than it was before. This paper uses Ball's 'toolkit' approach to education policy analysis to suggest that there are multiple 'effects' of NAPLAN, culminating in a series of disconnected conversations between various stakeholders.

Relevance:

10.00%

Publisher:

Abstract:

Disconnector switch operation in gas-insulated switchgear (GIS) generates very fast transient (VFT) voltages in the system. For insulation co-ordination purposes, it is important to obtain accurate VFT V-t data for typical gap geometries found in GIS. This paper presents experimentally obtained VFT V-t data for a 180/110 mm co-axial gap. The VFT has a time to first peak of 35 ns and an oscillation frequency of 13.6 MHz. Because the voltage divider is located in a compartment adjacent to the gap, a correction factor of 1.1 is used to relate the measured breakdown voltage to that across the gap. Positive-polarity VFT V-t data are presented for 1, 2, 3 and 4 bar absolute, and negative-polarity VFT data for 3 and 4 bar absolute. Two methods of generating the VFTs are used: the first is to energise the test transformer at power frequency; the second is to generate a switching impulse by discharging a capacitor into the primary of the test transformer.
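
A small worked example of the reported 1.1 divider-location correction factor; the measured breakdown voltages below are hypothetical and serve only to show how measured values map to gap values.

```python
# The divider sits in a compartment adjacent to the gap, so each measured breakdown
# voltage is scaled by the correction factor reported in the paper. The measured
# values here are hypothetical, for illustration only.
CORRECTION_FACTOR = 1.1

measured_breakdown_kv = [310.0, 295.0, 330.0]                 # hypothetical V-t points
gap_breakdown_kv = [CORRECTION_FACTOR * v for v in measured_breakdown_kv]

for vm, vg in zip(measured_breakdown_kv, gap_breakdown_kv):
    print(f"measured {vm:.0f} kV -> gap {vg:.0f} kV")
```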

Relevance:

10.00%

Publisher:

Abstract:

Various human activities, such as the combustion of fossil fuels, biomass burning, and industrial and agricultural processes, emit large amounts of particulates into the atmosphere. As a consequence, the air we inhale contains a significant amount of suspended particles, including organic and inorganic solids and liquids as well as various microorganisms, which are responsible for a number of pulmonary diseases. Developing a numerical model for the transport and deposition of foreign particles in realistic lung geometry is very challenging because of the complex geometrical structure of the human lung. In this study, we numerically investigate airborne particle transport and deposition on the human lung surface. To obtain appropriate results, we generated a realistic lung geometry from a CT scan obtained from a local hospital. For a more accurate approach, we also created a mucus layer inside the geometry, adjacent to the lung surface, and assigned the appropriate mucus-layer properties to the wall surface. The Lagrangian particle tracking technique is employed, using the ANSYS FLUENT solver, to simulate steady-state inspiratory flow. Various injection techniques are used to release the foreign particles through the inlet of the geometry. To investigate the effect of particle size on deposition, numerical calculations are carried out for particle sizes ranging from 1 to 10 microns. The numerical results show that the particle deposition pattern depends strongly on the initial particle position and that, in the realistic geometry, most of the particles are deposited on the rough wall surface of the lung rather than in the carinal region.
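
As a conceptual sketch of Lagrangian particle tracking, the code below integrates a single micron-scale particle through a prescribed toy airflow with Stokes drag. It is not the ANSYS FLUENT model or the CT-based geometry used in the study; the flow field, time step and particle properties are assumptions chosen only to illustrate the technique and the 1-10 micron size range.

```python
import numpy as np

MU_AIR = 1.8e-5        # dynamic viscosity of air [Pa s]
RHO_P = 1000.0         # particle density [kg/m^3] (assumed)
D_P = 5e-6             # particle diameter [m], within the 1-10 micron range studied
TAU_P = RHO_P * D_P ** 2 / (18 * MU_AIR)   # particle relaxation time (Stokes regime)

def air_velocity(x):
    """Toy airflow: a uniform 1 m/s axial flow with a weak cross-stream component."""
    return np.array([1.0, 0.1 * np.sin(50 * x[1]), 0.0])

def track(x0, v0, dt=1e-5, steps=2000):
    """Explicit-Euler integration of dv/dt = (u_air - v) / tau_p and dx/dt = v."""
    x, v = np.array(x0, float), np.array(v0, float)
    path = [x.copy()]
    for _ in range(steps):
        v += dt * (air_velocity(x) - v) / TAU_P   # Stokes drag acceleration
        x += dt * v                               # advect the particle
        path.append(x.copy())
    return np.array(path)

path = track(x0=[0.0, 0.0, 0.0], v0=[0.0, 0.0, 0.0])
print(f"final particle position [m]: {np.round(path[-1], 4)}")
```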