Abstract:
Physiological signals, which are controlled by the autonomic nervous system (ANS), could be used to detect the affective state of computer users and therefore find applications in medicine and engineering. The Pupil Diameter (PD) seems to provide a strong indication of the affective state, as found by previous research, but it has not yet been investigated fully. In this study, new approaches based on monitoring and processing the PD signal for off-line and on-line affective assessment (“relaxation” vs. “stress”) are proposed. Wavelet denoising and Kalman filtering methods are first used to remove abrupt changes in the raw PD signal. Then three features (PDmean, PDmax and PDWalsh) are extracted from the preprocessed PD signal for affective state classification. In order to select more relevant and reliable physiological data for further analysis, two types of data selection methods are applied, based on the paired t-test and on subject self-evaluation, respectively. In addition, five different classifiers are applied to the selected data, achieving average accuracies of up to 86.43% and 87.20%, respectively. Finally, the receiver operating characteristic (ROC) curve is used to investigate the discriminating potential of each individual feature by evaluating the area under the ROC curve, which reaches values above 0.90. For the on-line affective assessment, a hard threshold is first applied to remove eye blinks from the PD signal, and then a moving average window is used to obtain a representative value PDr for every one-second interval of PD. The on-line affective assessment algorithm has three main steps: preparation, feature-based decision voting and affective determination.
The final results show accuracies of 72.30% and 73.55% for the data subsets chosen using the two data selection methods (paired t-test and subject self-evaluation, respectively). To further analyze the efficiency of affective recognition through the PD signal, the Galvanic Skin Response (GSR) was also monitored and processed. The highest affective assessment classification rate obtained from GSR processing is only 63.57% (based on the off-line processing algorithm). The overall results confirm that the PD signal should be considered one of the most powerful physiological signals to include in future automated real-time affective recognition systems, especially for detecting the “relaxation” vs. “stress” states.
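The on-line preprocessing described above (a hard threshold rejecting eye blinks, then a one-second moving-average window producing a representative value PDr) can be sketched as follows. The sampling rate and threshold value used here are illustrative assumptions, not figures from the study:

```python
def preprocess_pd(samples, fs=60, blink_threshold=2.0):
    """Remove eye-blink artifacts with a hard threshold, then reduce each
    one-second window to a representative value PDr (the window mean).

    samples: raw pupil-diameter readings in mm; fs: sampling rate (Hz).
    The 60 Hz rate and 2.0 mm threshold are illustrative assumptions.
    """
    # Hard threshold: during a blink the measured diameter collapses,
    # so implausibly small readings are dropped.
    clean = [s for s in samples if s > blink_threshold]
    # Moving-average window: one PDr value per second of cleaned data.
    pdr = []
    for i in range(0, len(clean) - fs + 1, fs):
        window = clean[i:i + fs]
        pdr.append(sum(window) / len(window))
    return pdr
```

Each PDr value can then feed the feature-based decision voting stage of the on-line algorithm.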
Abstract:
This dissertation presents a thick ethnography that engages in the micro-analysis of the situationality of black middle-class collective identification processes through an examination of performances by members of the nine historically black sororities and fraternities at Atlanta Greek Picnic, an annual festival that occurs at the beginning of June in Atlanta, Georgia. It mainly attracts undergraduate and graduate members of these university-based organizations, as they exist all over the United States. This exploration of black Greek-letter organization (BGLO) performances uncovers processes through which young black middle-class individuals attempt to combine two universes that are at first glance in complete opposition to each other: the domain of traditional black middle-class values with representations and fashions stemming from black popular culture. These constructions also attempt to incorporate—in a contradiction of sorts—black popular cultural elements with the objective of deconstructing the social conservatism that characterizes middle-class values, particularly in relation to sexuality and its representation in social behaviors and performances. This negotiation between prescribed middle-class values of respectability and black popular culture provides a space wherein black individuals challenge and/or perpetuate those dominant tropes through identity performances that feed into the formation of black sexual politics, which I examine through a variety of BGLO staged and non-staged performances.
Abstract:
Three hatchery produced and reared (HPR) and five wild white sea bream (Diplodus sargus) were double tagged with Vemco V8SC-2L acoustic transmitters and Floy Tag T-bar anchor tags, and released on artificial reefs located near a natural reef off the southern coast of Portugal. Passive telemetry was used to monitor movements of the white sea bream over a nine week period from April to June 2007. Differences in behavior at release, habitat association (artificial vs. natural reef), and in daily movements were registered. Wild fish moved from one habitat to the other with increased preference for the artificial habitat during the day, whereas HPR fish showed no site fidelity or consistent daily movement pattern and left the release site soon after release. Comparison of Minimum Convex Polygon (MCP) showed a higher area usage by wild fish. This experiment shows that these artificial reefs are used on a daily basis by wild white sea bream but apparently are not optimal release locations for hatchery produced white sea bream.
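The Minimum Convex Polygon (MCP) used above to compare area usage is simply the convex hull of an animal's position fixes, with its area given by the shoelace formula. A pure-Python sketch, assuming positions are planar (x, y) coordinates (e.g. in metres):

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def mcp_area(points):
    """Shoelace area of the Minimum Convex Polygon around the position fixes."""
    hull = convex_hull(points)
    if len(hull) < 3:
        return 0.0
    s = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0
```

Comparing `mcp_area` for the fixes of wild versus hatchery fish gives the kind of area-usage contrast reported in the study.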
Abstract:
Deep water sharks are commonly caught as by-catch of longlines targeting bony fishes and trawlers targeting crustaceans in deep water off the southern Portuguese coast. Due to low or no commercial value, these species are most of the time discarded at sea, with only the larger specimens of some species commercialized at very low prices. In this study we present size distributions, maturity distributions, and sex ratios of 2,138 specimens belonging to four different species, namely the lantern sharks Etmopterus pusillus and Etmopterus spinax and the catsharks Galeus melastomus and Galeus atlanticus, caught with these two gears. Trawls generally caught smaller-sized specimens, in a wider length range than longlines. Trawls caught mostly immature specimens of all species, namely 83.7% of E. pusillus, 84.3% of E. spinax, 89.5% of G. melastomus, and 95.5% of G. atlanticus, while longlines caught mostly immature E. pusillus (69.2%) and G. melastomus (78.6%) and mostly mature E. spinax (88.2%) and G. atlanticus (87.2%). Trawls tended to catch more males than females of all species except E. spinax, while longlines caught more females than males of E. spinax and G. melastomus and more males than females of the other two species. The main conclusion of this work is that trawls are catching smaller-sized and mostly immature specimens when compared to longlines, meaning that they probably have a more detrimental effect on these shark populations. The data presented here have significant implications for the conservation of these shark populations since sizes, sexes, and the immature and mature components of the populations are being affected differently by these two fishing gears.
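Whether the immature proportion differs between the two gears can be checked with a simple 2x2 chi-square test, sketched below. The counts are illustrative placeholders, not the paper's raw data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]],
    e.g. rows = fishing gear, columns = immature/mature counts."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Illustrative counts (NOT the paper's data): trawl 90 immature / 10 mature,
# longline 60 immature / 40 mature.
stat = chi_square_2x2(90, 10, 60, 40)
```

With these counts the statistic is 24.0, well above the 3.84 critical value at the 5% level for one degree of freedom, so the gears would differ significantly in the maturity composition of their catch.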
Abstract:
Mobile devices in the near future will need to collaborate to fulfill their function. Collaboration will be done by communication. We use a real-world example of robotic soccer to derive the structures required for robotic communication. A review of related work is done, and it is found that no existing examples come close to providing a RANET. The robotic ad hoc network (RANET) we suggest uses existing structures drawn from the areas of wireless networks, peer-to-peer systems and software life-cycle management. Gaps are found in the existing structures, so we describe how to extend some of them to satisfy the design. The RANET design supports robot cooperation by exchanging messages, discovering needed skills that other robots on the network may possess, and transferring those skills. The network is built on top of a Bluetooth wireless network and uses JXTA to communicate and transfer skills. OSGi bundles form the skills that can be transferred. To test the final design, a reference implementation is done. Deficiencies in some third-party software are found, specifically in JXTA, JamVM and GNU Classpath. Lastly, we look at how to fix the deficiencies by porting the JXTA C implementation to the target robotic platform and potentially eliminating the TCP/IP layer, using UDP instead of TCP, or using an adaptive TCP/IP stack. We also propose future areas of investigation: how to seed the configuration for the Personal Area Network (PAN) Bluetooth protocol extension so a Bluetooth TCP/IP link is formed more quickly, and using STP to allow multi-hop messaging and transfer of skills.
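The skill-discovery exchange described above reduces, at its core, to advertising and parsing a small message. JXTA advertisements are actually XML documents; the JSON encoding below is a deliberately simplified, hypothetical stand-in (the field names are assumptions) that shows the shape of the exchange:

```python
import json

def encode_skill_advert(robot_id, skills):
    """Serialize a skill advertisement for broadcast over the RANET.
    Hypothetical wire format; JXTA's real XML advertisements are not
    reproduced here."""
    return json.dumps({"type": "SKILL_ADVERT",
                       "robot": robot_id,
                       "skills": sorted(skills)}).encode("utf-8")

def decode_skill_advert(payload):
    """Parse a received advertisement, returning (robot_id, skill list)."""
    msg = json.loads(payload.decode("utf-8"))
    if msg.get("type") != "SKILL_ADVERT":
        raise ValueError("not a skill advertisement")
    return msg["robot"], msg["skills"]
```

A robot that lacks a skill listed in a received advertisement could then request the corresponding OSGi bundle from the advertising peer.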
Abstract:
The Co-operative Research Centre for Construction Innovation (CRC-CI) is funding a project known as Value Alignment Process for Project Delivery. The project consists of a study of best practice project delivery and the development of a suite of products, resources and services to guide project teams towards the best procurement approach for a specific project or group of projects. These resources will focus on promoting the principles that underlie best practice project delivery rather than simply identifying an off-the-shelf procurement system. This project builds on earlier work by Sidwell, Kennedy and Chan (2002) on re-engineering the construction delivery process, which developed a procurement framework in the form of a Decision Matrix.
Abstract:
Background The problem of silent multiple comparisons is one of the most difficult statistical problems faced by scientists. It is a particular problem for investigating a one-off cancer cluster reported to a health department because any one of hundreds, or possibly thousands, of neighbourhoods, schools, or workplaces could have reported a cluster, which could have been for any one of several types of cancer or any one of several time periods. Methods This paper contrasts the frequentist approach with a Bayesian approach for dealing with silent multiple comparisons in the context of a one-off cluster reported to a health department. Two published cluster investigations were re-analysed using the Dunn-Sidak method to adjust frequentist p-values and confidence intervals for silent multiple comparisons. Bayesian methods were based on the Gamma distribution. Results Bayesian analysis with non-informative priors produced results similar to the frequentist analysis, and suggested that both clusters represented a statistical excess. In the frequentist framework, the statistical significance of both clusters was extremely sensitive to the number of silent multiple comparisons, which can only ever be a subjective "guesstimate". The Bayesian approach is also subjective: whether there is an apparent statistical excess depends on the specified prior. Conclusion In cluster investigations, the frequentist approach is just as subjective as the Bayesian approach, but the Bayesian approach is less ambitious in that it treats the analysis as a synthesis of data and personal judgements (possibly poor ones), rather than objective reality. Bayesian analysis is (arguably) a useful tool to support complicated decision-making, because it makes the uncertainty associated with silent multiple comparisons explicit.
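The two adjustments contrasted above are straightforward to compute. The sketch below uses a Gamma(1, 1) prior for illustration; the priors actually specified in the paper may differ:

```python
def sidak_adjust(p, m):
    """Dunn-Sidak adjustment of a single p-value for m silent comparisons."""
    return 1.0 - (1.0 - p) ** m

def gamma_posterior_mean(observed, expected, a=1.0, b=1.0):
    """Posterior mean of the standardised incidence ratio for a Poisson
    count with a Gamma(a, b) prior: the posterior is
    Gamma(a + observed, b + expected).  The default Gamma(1, 1) prior is
    an illustrative choice, not the paper's."""
    return (a + observed) / (b + expected)
```

For example, a cluster with p = 0.01 stops looking remarkable once 100 silent comparisons are acknowledged: the Dunn-Sidak-adjusted p-value is about 0.63, which is exactly the sensitivity to the "guesstimate" of m that the paper highlights.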
Abstract:
Off-site Manufacture (OSM) has long been recognised, both in Australia and internationally, as offering numerous benefits to all parties in the construction process. More importantly, it is recognised as a key vehicle for driving improvement within the construction industry. The uptake of OSM in construction is however limited, despite well documented benefits. The research aims to determine the ‘state-of-the-art’ of OSM in Australia. It confirms the benefits and identifies the real and perceived barriers to the widespread adoption of OSM. Further the project identifies opportunities for future investment and research. Although numerous reports have been produced in the UK on the state of OSM adoption within that region, no prominent studies exist for the Australian context. This scoping study is an essential component upon which to build any initiatives that can take advantage of the benefits of OSM in construction. The Construction 2020 report predicted that OSM is set to increase in use over the next 5-15 years, further justifying the need for such a study. The long-term goal of this study is to contribute to the improvement of the Australian construction industry through a realisation of the potential benefits of OSM.
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-80s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market shares from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users more rights, such as the free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: providers of packaged solutions and IT Services & Software Engineering firms base their activities on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the ''bundling'' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. This case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for the appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes beyond the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not only considered as a way of generating income (the ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses business models from the standpoint of these two components.
Abstract:
Decentralized and regional load-frequency control of power systems operating in normal and near-normal conditions has been well studied, and several analysis/synthesis approaches have been developed during the last few decades. However, in contingency and off-normal conditions, the existing emergency control plans, such as under-frequency load shedding, are usually applied in a centralized structure using a different analysis model. This paper discusses the feasibility of using frequency-based emergency control schemes based on tie-line measurements and local information available within a control area. The conventional load-frequency control model is generalized by considering the dynamics of emergency control/protection schemes, and an analytic approach to analyzing the regional frequency response under normal and emergency conditions is presented.
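The interplay between primary frequency response and an under-frequency load-shedding (UFLS) stage can be illustrated with a one-area simulation. The model below is a generic textbook swing model, not the generalized model or analytic approach of the paper, and all parameter values are illustrative assumptions:

```python
def simulate_frequency(load_step=0.10, shed=0.09, threshold=49.2,
                       h=5.0, d=1.0, f0=50.0, dt=0.01, t_end=100.0):
    """Euler simulation of a one-area swing model
        2H * d(df_pu)/dt = -dP_load - D * df_pu
    with a single under-frequency load-shedding stage tripping at
    `threshold` Hz.  All parameters are illustrative assumptions.
    Returns (final frequency, minimum frequency, stage tripped?)."""
    df = 0.0          # frequency deviation, per unit
    dp = load_step    # unserved load step, per unit
    tripped = False
    f_min = f0
    for _ in range(int(t_end / dt)):
        f = f0 * (1.0 + df)
        f_min = min(f_min, f)
        if not tripped and f < threshold:
            dp -= shed  # UFLS stage trips, shedding part of the load
            tripped = True
        df += dt * (-dp - d * df) / (2.0 * h)
    return f0 * (1.0 + df), f_min, tripped
```

Without the shedding stage the 0.10 pu disturbance settles around 45 Hz; with one stage tripped at 49.2 Hz the area recovers to roughly 49.5 Hz, which is the basic effect a decentralized UFLS scheme is meant to achieve.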
Abstract:
This study employs a BP (back-propagation) neural network to simulate the development of Chinese private passenger cars. Considering the uncertain and complex environment for the development of private passenger cars, indicators of economy, population, price, infrastructure, income, energy and other fields which have major impacts on it are selected first. After modeling, training and generalization testing, the network is shown to be capable of simulating the development of Chinese private passenger cars. Based on the BP neural network model, a sensitivity analysis of each indicator is carried out and shows that the sensitivity coefficients of fuel price change suddenly. This phenomenon reveals that the development of Chinese private passenger cars may be seriously affected by the recent high fuel price. This finding is also consistent with facts and figures.
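The modelling steps described above (train a back-propagation network on the indicators, then probe each indicator's sensitivity coefficient) can be sketched as follows. The network size, learning rate and toy data are illustrative assumptions, not the study's configuration:

```python
import math, random

def train_bp(data, hidden=4, lr=0.2, epochs=3000, seed=0):
    """Minimal one-hidden-layer BP network (sigmoid hidden units, linear
    output) trained by plain per-sample gradient descent.  A toy stand-in
    for the paper's model."""
    rng = random.Random(seed)
    n_in = len(data[0][0])
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + b)))
             for row, b in zip(w1, b1)]
        return h, sum(w * hi for w, hi in zip(w2, h)) + b2

    for _ in range(epochs):
        for x, y in data:
            h, out = forward(x)
            err = out - y  # back-propagate the squared-error gradient
            for j in range(hidden):
                grad_h = err * w2[j] * h[j] * (1.0 - h[j])
                for i in range(n_in):
                    w1[j][i] -= lr * grad_h * x[i]
                b1[j] -= lr * grad_h
                w2[j] -= lr * err * h[j]
            b2 -= lr * err
    return lambda x: forward(x)[1]

def sensitivity(model, xs, i, delta=0.05):
    """Sensitivity coefficient of input i: mean absolute central-difference
    derivative of the model output over the sample points."""
    total = 0.0
    for x in xs:
        lo, hi = list(x), list(x)
        lo[i] -= delta
        hi[i] += delta
        total += abs(model(hi) - model(lo)) / (2 * delta)
    return total / len(xs)
```

On a toy target such as y = 0.8·x1 + 0.1·x2, the trained network's sensitivity to x1 comes out far larger than to x2, which is the kind of indicator ranking the study's sensitivity analysis produces.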
Abstract:
This paper reports on an empirically based study of the Queensland (Australia) health and fitness industry over 15 years (1993-2008). This study traces the development of the new occupation of fitness instructor in a service industry which has evolved since the 1980s and is embedded in values of consumption and individualism. It is the new world of work. The data from the 1993 study was historically significant, capturing the conditions of employment in an unregulated setting prior to the introduction of the first industrial award in that industry in 1994. Fitness workers bargained directly with employers over all aspects of the employment relationship without the constraints of industrial regulation or the presence of trade unions. The substantive outcomes of the employment relationship were a direct reflection of managerial prerogative and worker orientation and preference, and did not reflect the rewards and outcomes traditionally found in Australian workplaces. While the focus of the 1993 research was on exploring the employment relationship in a deregulated environment, an unusual phenomenon was identified: fitness workers happily trading off what would be considered standard working conditions for the opportunity to work (‘take the stage’). Since then, several streams of literature have evolved providing a new context for understanding this phenomenon in the fitness industry, including: the sociology of the body (Shilling 1993; Turner 1996); emotional (Hochschild 1984) and aesthetic labour (Warhurst et al 2000); the social relations of production and space (Lefebvre 1991; Moss 1995); body history (Helps 2007); the sociology of consumption (Saunders 1988; Baudrillard 1998; Ritzer 2004); and work identity (Du Gay 1996; Strangleman 2004). The 2008 survey instrument replicated the 1993 study but was additionally informed by the new literature. Surveys were sent to 310 commercial fitness centres and 4,800 fitness workers across Queensland.
Worker orientation appears unchanged, and industry working conditions still seem atypical despite regulation since 1994. We argue that for many fitness workers the goal is to gain access to the fitness centre economy. For this they are willing to trade off standard conditions of employment, and exchange traditional employment rewards for more intrinsic psycho-social rewards gained through the exposure of their physical capital (Bourdieu 1984) or bodily prowess to the adoration of their gazing clients. Building on the tradition of emotional labour and aesthetic labour, this study introduces the concept of ocularcentric labour: a state in which labour’s quest for the psycho-social rewards gained from their own body image shapes the employment relationship. With ocularcentric labour the psycho-social rewards have greater value for the worker than ‘hard’, core conditions of employment, and are a significant factor in bargaining and outcomes, often substituting for direct earnings. The workforce profile (young, female, casual) and their expectations (psycho-social rewards of adoration and celebrity) challenge traditional trade unions in terms of what they can deliver, given the fitness workers’ willingness to trade off minimum conditions, hard-won by unions.
Abstract:
With increasingly complex engineering assets and tight economic requirements, asset reliability becomes more crucial in Engineering Asset Management (EAM). Improving the reliability of systems has always been a major aim of EAM. Reliability assessment using degradation data has become a significant approach to evaluating the reliability and safety of critical systems. Degradation data often provide more information than failure time data for assessing reliability and predicting the remnant life of systems. In general, degradation is the reduction in performance, reliability, and life span of assets. Many failure mechanisms can be traced to an underlying degradation process. Degradation is a stochastic process and can therefore be modelled using several approaches. Degradation modelling techniques have generated a great amount of research in the reliability field. While degradation models play a significant role in reliability analysis, there are few review papers on the topic. This paper presents a review of the existing literature on commonly used degradation models in reliability analysis. The current research and developments in degradation models are reviewed and summarised. This study synthesises these models and classifies them into groups, and attempts to identify the merits, limitations, and applications of each model. It also outlines potential applications of these degradation models in asset health and reliability prediction.
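One stochastic degradation model commonly covered in such reviews is the Wiener process (Brownian motion with drift), under which an asset fails when its degradation path first crosses a threshold. A minimal simulation sketch with illustrative parameters:

```python
import random

def simulate_wiener_paths(mu=1.0, sigma=0.2, threshold=10.0,
                          dt=0.1, n_paths=500, t_max=50.0, seed=42):
    """Simulate Wiener-process degradation X(t) = mu*t + sigma*B(t) and
    record the first time each path crosses the failure threshold.
    Parameters are illustrative, not drawn from any particular asset."""
    rng = random.Random(seed)
    hit_times = []
    for _ in range(n_paths):
        x, t = 0.0, 0.0
        while t < t_max:
            x += mu * dt + sigma * rng.gauss(0.0, dt ** 0.5)
            t += dt
            if x >= threshold:
                hit_times.append(t)
                break
    return hit_times

times = simulate_wiener_paths()
mttf = sum(times) / len(times)
```

For drift mu and threshold D, the first-passage time of a Wiener process follows an inverse-Gaussian distribution with mean D/mu, so the simulated mean time to failure should sit near 10 here; this is the kind of remnant-life prediction such degradation models support.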