859 results for Consumer multi-stage choice process


Relevance:

30.00%

Publisher:

Abstract:

This research examines the entrepreneurship phenomenon, and the question: Why are some venture attempts more successful than others? This question is not a new one. Prior research has answered it by describing those who engage in nascent entrepreneurship. Yet this approach has yielded little consensus and offers little comfort for those newly considering venture creation (Gartner, 1988). Rather, this research considers the process of venture creation by focusing on the actions of nascent entrepreneurs. However, the venture creation process is complex (Liao, Welsch, & Tan, 2005) and multi-dimensional (Davidsson, 2004). The process can vary in the amount of action engaged in by the entrepreneur; the temporal dynamics of how action is enacted (Lichtenstein, Carter, Dooley, & Gartner, 2007); or the sequence in which actions are undertaken. Little is known about whether any, or all three, of these dimensions matter. Further, there exists scant general knowledge about how the venture creation process influences venture creation outcomes (Gartner & Shaver, 2011). Therefore, this research conducts a systematic study of what entrepreneurs do as they create a new venture. The primary goal is to develop general principles so that advice may be offered on how to ‘proceed’, rather than how to ‘be’. Three integrated empirical studies were conducted that separately focus on each of the interrelated dimensions. The basis for this was a randomly sampled, longitudinal panel of nascent ventures. Upon recruitment these ventures were in the process of being created, but yet to be established as new businesses. The ventures were tracked one year later to follow up on outcomes. Accordingly, this research makes the following original contributions to knowledge. First, the findings suggest that all three of the dimensions play an important role: action, dynamics, and sequence. This implies that future research should take a multi-dimensional view of the venture creation process; failing to do so can only result in a limited understanding of a complex phenomenon. Second, action is the fundamental means through which venture creation is achieved. Simply put, more active venture creation efforts are more likely to be successful. Further, action is the medium through which resource endowments exert their effect upon venture outcomes. Third, the dynamics of how venture creation plays out over time are also influential. Here, a process with a high rate of action which increases in intensity is more likely to achieve positive outcomes. Fourth, sequence analysis suggests that the order in which actions are taken also drives outcomes. Although venture creation generally flows in sequence from discovery toward exploitation (Shane & Venkataraman, 2000; Eckhardt & Shane, 2003; Shane, 2003), processes that actually proceed in this way are less likely to be realized. Instead, processes which specifically intertwine discovery and exploitation action together in symbiosis are more likely to achieve better outcomes (Sarasvathy, 2001; Baker, Miner, & Eesley, 2003). Further, an optimal venture creation order exists somewhere between these sequential and symbiotic process archetypes: a process which starts out as symbiotic discovery and exploitation, but switches to focus exclusively on exploitation later on, is most likely to achieve venture creation. These sequence findings are unique, and suggest future integration between opposing theories of order in venture creation.

Relevance:

30.00%

Publisher:

Abstract:

Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably, with known levels at which it is significantly different from zero, and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze the evoked EEG response to flash visual stimuli presented to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time, using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
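A minimal sketch of the ensemble bicoherence estimate described above, assuming the stimulus-locked epochs at a given offset are arranged as a (trials x samples) array and using the Kim and Powers normalisation; the significance thresholds and the multi-channel segmentation step of the work are not reproduced here.

```python
import numpy as np

def bicoherence(epochs, fs, nfft=None):
    """Squared bicoherence over an ensemble of stimulus-locked epochs.

    epochs : (n_trials, n_samples) array of segments at the same offset.
    Returns (freqs, b2) with b2[i, j] the squared bicoherence at the
    bifrequency (freqs[i], freqs[j]) (Kim & Powers normalisation, assumed).
    """
    n_trials, n_samples = epochs.shape
    nfft = nfft or n_samples
    win = np.hanning(n_samples)
    # remove the per-trial mean, window, and take the one-sided FFT
    X = np.fft.rfft((epochs - epochs.mean(axis=1, keepdims=True)) * win,
                    n=nfft, axis=1)
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    nf = X.shape[1] // 2               # keep f1 + f2 inside the spectrum
    b2 = np.zeros((nf, nf))
    for i in range(nf):
        for j in range(nf):
            triple = X[:, i] * X[:, j] * np.conj(X[:, i + j])
            num = np.abs(triple.mean()) ** 2
            den = ((np.abs(X[:, i] * X[:, j]) ** 2).mean()
                   * (np.abs(X[:, i + j]) ** 2).mean())
            b2[i, j] = num / den if den > 0 else 0.0
    return freqs[:nf], b2
```

Tracking the estimate as a function of offset from the stimulus would then amount to repeating this over a sliding set of epoch windows.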

Relevance:

30.00%

Publisher:

Abstract:

Widespread adoption by electricity utilities of Non-Conventional Instrument Transformers, such as optical or capacitive transducers, has been limited by the lack of a standardised interface and of multi-vendor interoperability. Low power analogue interfaces are being replaced by IEC 61850-9-2 and IEC 61869-9 digital interfaces that use Ethernet networks for communication. These ‘process bus’ connections achieve significant cost savings by simplifying connections between the switchyard and control rooms; however, the in-service performance when these standards are employed is largely unknown. The performance of real-time Ethernet networks and time synchronisation was assessed using a scale model of a substation automation system. The test bed was constructed from commercially available timing and protection equipment supplied by a range of vendors. Test protocols have been developed to thoroughly evaluate the performance of Ethernet networks and network-based time synchronisation. The suitability of IEEE Std 1588 Precision Time Protocol (PTP) as a synchronising system for sampled values was tested in the steady state and under transient conditions. Similarly, the performance of hardened Ethernet switches designed for substation use was assessed under a range of network operating conditions. This paper presents test methods that use a precision Ethernet capture card to accurately measure PTP and network performance. These methods can be used for product selection and to assess ongoing system performance as substations age. Key findings on the behaviour of multi-function process bus networks are presented. System-level tests were performed using a Real Time Digital Simulator and a transformer protection relay with sampled value and Generic Object Oriented Substation Events (GOOSE) capability; these tests included the interactions between sampled values, PTP and GOOSE messages. Our research has demonstrated that several protocols can be used on a shared process bus, even with very high network loads. This should provide confidence that this technology is suitable for transmission substations.
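The abstract does not give the test protocols in detail; purely as an illustration of the kind of calculation involved, the sketch below computes the standard IEEE 1588 offset and path-delay estimate from one Sync/Delay_Req timestamp exchange, and summarises sampled-value frame transfer delay from capture-card timestamps. The function names and the two-tap capture setup are assumptions, not the paper's method.

```python
import numpy as np

def ptp_offset(t1, t2, t3, t4):
    """Offset and mean path delay from one PTP exchange (seconds).

    t1: master Sync transmit, t2: slave Sync receive,
    t3: slave Delay_Req transmit, t4: master Delay_Req receive.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

def transfer_delay_stats(t_sent, t_received):
    """Latency and jitter of frames captured at publishing and subscribing taps."""
    d = np.asarray(t_received) - np.asarray(t_sent)
    return {
        "mean_us": d.mean() * 1e6,
        "p99_us": np.percentile(d, 99) * 1e6,
        "jitter_us": d.std() * 1e6,    # simple standard-deviation jitter
        "max_us": d.max() * 1e6,
    }
```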

Relevance:

30.00%

Publisher:

Abstract:

In this chapter, the picture of Australian small business is supplemented by using data from the Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE). This dataset tracks large numbers of ongoing business start-ups over time. The Australian Centre for Entrepreneurship Research at Queensland University of Technology collected data in four annual waves (Wave 1 to Wave 4) from 2007 to 2011. CAUSEE allows the analysis of entrepreneurial entrants at two stages of development, i.e. nascent and young firms. Nascent firms are defined as firms in the process of being created, but not yet established in the market, and young firms are defined as having been operational for up to four years. An analysis of nascent firms provides unique insights, as no other known Australian database captures and follows the development of business start-ups at the pre-operational stage. In addition, the project captured judgment oversamples of high-potential start-ups.

Relevance:

30.00%

Publisher:

Abstract:

Immigration has played an important role in the historical development of Australia. Thus, it is no surprise that a large body of empirical work has developed which focuses upon how migrants fare in the land of opportunity. Much of the literature is comparatively recent, i.e. from the last ten years or so, encouraged by the advent of publicly available Australian cross-section micro data. Several different aspects of migrant welfare have been addressed, with major emphasis being placed upon earnings and unemployment experience. For recent examples see Haig (1980), Stromback (1984), Chiswick and Miller (1985), Tran-Nam and Nevile (1988) and Beggs and Chapman (1988). The present paper contributes to the literature by providing additional empirical evidence on the native/migrant earnings differential. The data utilised are from the rather neglected Australian Bureau of Statistics (ABS) Special Supplementary Survey No. 4, 1982, otherwise known as the Family Survey. The paper also examines the importance of distinguishing between the wage and salary sector and the self-employment sector when discussing native/migrant differentials. Separate earnings equations for the two labour market groups are estimated and the native/migrant earnings differential is broken down by employment status. This is a novel application in the Australian context and provides some insight into the earnings of the self-employed, a group that despite its size (around 20 per cent of the labour force) is frequently ignored by economic research. Most previous empirical research fails to examine the effect of employment status on earnings. Stromback (1984) includes a dummy variable representing self-employment status in an earnings equation estimated over a pooled sample of paid and self-employed workers. The variable is found to be highly significant, which leads Stromback to question the efficacy of including the self-employed in the estimation sample. The suggestion is that part of self-employed earnings represents a return to non-human capital investment, i.e. investments in machinery, buildings etc., so that the structural determinants of earnings differ significantly from those for paid employees. Tran-Nam and Nevile (1988) deal with differences between paid employees and the self-employed by deleting the latter from their sample. However, deleting the self-employed from the estimation sample may lead to bias in the OLS estimation method (see Heckman 1979). The desirable properties of OLS depend upon estimation on a random sample. Thus, the Tran-Nam and Nevile results are likely to suffer from bias unless individuals are randomly allocated between self-employment and paid employment. The current analysis extends Tran-Nam and Nevile (1988) by explicitly treating the choice of paid employment versus self-employment as endogenously determined. This allows an explicit test of the appropriateness of deleting self-employed workers from the sample. Earnings equations that are corrected for sample selection are estimated for both natives and migrants in the paid-employee sector. The Heckman (1979) two-step estimator is employed. The paper is divided into five major sections. The next section presents the econometric model, incorporating the specification of the earnings generating process together with an explicit model determining an individual's employment status. In Section III the data are described. Section IV draws together the main econometric results of the paper.
First, the probit estimates of the labour market status equation are documented. This is followed by presentation and discussion of the Heckman two-stage estimates of the earnings specification for both native and migrant Australians. Separate earnings equations are estimated for paid employees and the self-employed. Section V documents estimates of the native/migrant earnings differential for both categories of employees. To aid comparison with earlier work, the Oaxaca decomposition of the earnings differential for paid employees is carried out for both the simple OLS regression results and the parameter estimates corrected for sample selection effects. These differentials are interpreted and compared with previous Australian findings. A short section concludes the paper.
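The Heckman (1979) two-step estimator named above follows a standard recipe: a probit equation for the employment-status choice, an inverse Mills ratio computed from its linear index, and an OLS earnings equation on the selected (paid-employee) sample augmented with that ratio. A minimal sketch is given below; the column names and regressors are placeholders rather than the Family Survey variables, and the procedure would be run separately for natives and migrants.

```python
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

def heckman_two_step(df, wage_col, select_col, wage_exog, select_exog):
    """Heckman (1979) two-step correction for sector self-selection.

    select_col: 1 if the individual is a paid employee, 0 if self-employed.
    wage_col:   log earnings, used only where select_col == 1.
    """
    # Step 1: probit sector-choice (selection) equation over the full sample.
    Z = sm.add_constant(df[select_exog])
    probit = sm.Probit(df[select_col], Z).fit(disp=0)
    zb = Z @ probit.params                                   # linear index Z'gamma
    imr = pd.Series(norm.pdf(zb) / norm.cdf(zb), index=df.index)  # inverse Mills ratio

    # Step 2: OLS earnings equation on paid employees, augmented with the IMR.
    sel = df[select_col] == 1
    X = sm.add_constant(df.loc[sel, wage_exog]).assign(imr=imr[sel])
    ols = sm.OLS(df.loc[sel, wage_col], X).fit()
    return probit, ols   # a significant IMR coefficient signals selection bias
```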

Relevance:

30.00%

Publisher:

Abstract:

Supply chain management and customer relationship management are concepts for optimizing the provision of goods to customers. Information sharing and information estimation are key tools used to implement these two concepts. The reduction of delivery times and stock levels can be seen as the main managerial objectives of integrative supply chain and customer relationship management. To achieve this objective, business processes need to be integrated along the entire supply chain, including the end consumer. Information systems form the backbone of any business process integration. The relevant information system architectures are generally well understood, but the conceptual specification of information systems for business process integration from a management perspective remains an open methodological problem. To address this problem, we will show how customer relationship management and supply chain management information can be integrated at the conceptual level in order to provide supply chain managers with relevant information. We will further outline how the conceptual management perspective of business process integration can be supported by deriving specifications for enabling information systems from business objectives.

Relevance:

30.00%

Publisher:

Abstract:

A novel Glass Fibre Reinforced Polymer (GFRP) sandwich panel was developed by an Australian manufacturer for civil engineering applications. This research is motivated by the new applications of GFRP sandwich structures in civil engineering, such as slabs, beams, girders and sleepers. An optimisation methodology is developed in this work to enhance the design of GFRP sandwich beams. The designs of single and glue-laminated GFRP sandwich beams were developed using numerical optimisation. The numerical multi-objective optimisation considered two design objectives simultaneously: cost and mass. It uses the Adaptive Range Multi-objective Genetic Algorithm (ARMOGA) and the Finite Element (FE) method. Trade-offs between the objectives were found during the optimisation process. The multi-objective optimisation yielded a core-to-skin mass ratio of 3.68 for the single sandwich beam cross-section and an optimum core-to-skin thickness ratio of 11.0.
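ARMOGA itself is not reproduced here; as a minimal illustration of the cost-mass trade-off the optimisation explores, the sketch below extracts the non-dominated (Pareto) designs from a set of candidate beams that have already been evaluated, e.g. by the FE model. The data layout is an assumption.

```python
import numpy as np

def pareto_front(objectives):
    """Indices of non-dominated designs when minimising all objectives.

    objectives : (n_designs, n_objectives) array, e.g. columns [cost, mass].
    """
    obj = np.asarray(objectives, dtype=float)
    n = obj.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # design i is dominated if some other design is no worse in every
        # objective and strictly better in at least one
        dominates = np.all(obj <= obj[i], axis=1) & np.any(obj < obj[i], axis=1)
        if dominates.any():
            keep[i] = False
    return np.where(keep)[0]

# example: four hypothetical (cost, mass) evaluations
designs = np.array([[120.0, 45.0], [110.0, 50.0], [130.0, 40.0], [140.0, 55.0]])
print(pareto_front(designs))   # the last design is dominated
```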

Relevance:

30.00%

Publisher:

Abstract:

Particulate matter research is essential because of the well-known significant adverse effects of aerosol particles on human health and the environment. In particular, identification of the origin or sources of particulate matter emissions is of paramount importance in assisting efforts to control and reduce air pollution in the atmosphere. This thesis aims to: identify the sources of particulate matter; compare pollution conditions at urban, rural and roadside receptor sites; combine information about the sources with meteorological conditions at the sites to locate the emission sources; compare sources based on particle size or mass; and ultimately, provide the basis for control and reduction of particulate matter concentrations in the atmosphere. To achieve these objectives, data were obtained from assorted local and international receptor sites over long sampling periods. The samples were analysed using Ion Beam Analysis and Scanning Mobility Particle Sizer methods to measure the particle mass with chemical composition and the particle size distribution, respectively. Advanced data analysis techniques were employed to derive information from large, complex data sets. Multi-Criteria Decision Making (MCDM), a ranking method, drew on data variability to examine the overall trends and provided the rank ordering of the sites and years in which sampling was conducted. Coupled with the receptor model Positive Matrix Factorisation (PMF), the pollution emission sources were identified and meaningful information pertinent to the prioritisation of control and reduction strategies was obtained. This thesis is presented in the thesis-by-publication format. It includes four refereed papers which together demonstrate a novel combination of data analysis techniques that enabled particulate matter sources to be identified and sampling sites/years to be ranked. The strength of this source identification process was corroborated when the analysis procedure was expanded to encompass multiple receptor sites. Initially applied to identify the contributing sources at roadside and suburban sites in Brisbane, the technique was subsequently applied to three receptor sites (roadside, urban and rural) located in Hong Kong. The comparable results from these international and national sites over several sampling periods indicated similarities in source contributions between receptor site-types, irrespective of global location, and suggested the need to apply these methods to air pollution investigations worldwide. Furthermore, an investigation into particle size distribution data was conducted to deduce the sources of aerosol emissions based on particle size and elemental composition. Considering the adverse effects on human health caused by small-sized particles, knowledge of particle size distribution and elemental composition provides a different perspective on the pollution problem. This thesis clearly illustrates that the application of an innovative combination of advanced data interpretation methods to identify particulate matter sources and rank sampling sites/years provides the basis for the prioritisation of future air pollution control measures. Moreover, this study contributes significantly to knowledge of the chemical composition of airborne particulate matter in Brisbane, Australia, and of the identity and plausible locations of the contributing sources. Such novel source apportionment and ranking procedures are ultimately applicable to environmental investigations worldwide.
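PMF weights each observation by its measurement uncertainty, which a plain non-negative matrix factorisation does not; as a simplified stand-in for the receptor modelling step, a factorisation of an element-concentration matrix into source contributions and source profiles might look like the sketch below. The number of sources and the data layout are assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

def apportion_sources(concentrations, n_sources):
    """Rough source apportionment of an element-concentration matrix.

    concentrations : (n_samples, n_species) non-negative matrix of measured
    elemental concentrations. Returns (contributions, profiles) such that
    concentrations is approximately contributions @ profiles.
    """
    model = NMF(n_components=n_sources, init="nndsvda",
                max_iter=1000, random_state=0)
    contributions = model.fit_transform(concentrations)  # G: sample-by-source
    profiles = model.components_                          # F: source-by-species
    return contributions, profiles
```

In practice each source profile would then be interpreted by comparing its dominant elements with known signatures (e.g. sea salt, soil, vehicle exhaust).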

Relevance:

30.00%

Publisher:

Abstract:

A considerable amount of research has proposed optimization-based approaches employing various vibration parameters for structural damage diagnosis. The damage detection achieved by these methods is in fact the result of updating the analytical structural model in line with the current physical model. The feasibility of these approaches has been proven, but most of the verification has been done on simple structures, such as beams or plates. When applied to a complex structure, such as a steel truss bridge, a traditional optimization process will require massive computational resources and converge slowly. This study presents a multi-layer genetic algorithm (ML-GA) to overcome this problem. Unlike the tedious convergence of a conventional damage optimization process, in each layer the proposed algorithm divides the GA's population into groups with a smaller number of damage candidates; the converged population of each group then evolves as an initial population of the next layer, where the groups merge into larger groups. Because parallel computation can be implemented in a damage detection process featuring ML-GA, both the optimization performance and the computational efficiency can be enhanced. In order to assess the proposed algorithm, the modal strain energy correlation (MSEC) has been considered as the objective function. Several damage scenarios of a complex steel truss bridge's finite element model have been employed to evaluate the effectiveness and performance of ML-GA against a conventional GA. In both single- and multiple-damage scenarios, the analytical and experimental study shows that the MSEC index achieves excellent damage indication and efficiency using the proposed ML-GA, whereas the conventional GA converges only to a local solution.
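A toy sketch of the layering idea, under stated assumptions: damage candidates are grouped, each group is searched by a small GA restricted to its own candidates, and the converged sub-populations are merged as the starting population of the next layer. The GA operators and the objective are placeholders; `fitness` stands in for the MSEC-based objective evaluated on the finite element model, and none of this reproduces the paper's implementation.

```python
import numpy as np

def tiny_ga(fitness, pop, n_gen=50, mut=0.1, active=None, rng=None):
    """Minimal real-coded GA: tournament selection, blend crossover, mutation.
    `active` masks which damage candidates this layer may perturb."""
    rng = rng if rng is not None else np.random.default_rng(0)
    active = np.ones(pop.shape[1]) if active is None else active
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        new = []
        for _ in range(len(pop)):
            i, j, k, m = rng.integers(len(pop), size=4)
            a = pop[i] if scores[i] < scores[j] else pop[j]   # minimise
            b = pop[k] if scores[k] < scores[m] else pop[m]
            w = rng.random()
            child = w * a + (1 - w) * b                       # blend crossover
            child = child + mut * rng.standard_normal(pop.shape[1]) * active
            new.append(np.clip(child, 0.0, 1.0))              # damage ratios in [0, 1]
        pop = np.array(new)
    return pop

def multi_layer_ga(fitness, n_elements, group_size=4, pop_per_group=20, rng=None):
    """Layer 1 searches small groups of damage candidates independently;
    layer 2 merges the converged sub-populations and searches all candidates."""
    rng = rng if rng is not None else np.random.default_rng(0)
    groups = [list(range(i, min(i + group_size, n_elements)))
              for i in range(0, n_elements, group_size)]
    converged = []
    for g in groups:
        active = np.zeros(n_elements)
        active[g] = 1.0
        pop = np.zeros((pop_per_group, n_elements))
        pop[:, g] = rng.random((pop_per_group, len(g)))
        converged.append(tiny_ga(fitness, pop, active=active, rng=rng))
    merged = np.vstack(converged)
    return tiny_ga(fitness, merged, rng=rng)
```

Each layer-1 group can be evolved on a separate worker, which is where the parallelism and efficiency gains described above would come from.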

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an organisational learning system implemented across a three-year period within a multi-campus tertiary library. It proposes a three-stage system, framed within a reflective, evidence-based practice process, to foster professional engagement and lifelong learning of staff.

Relevance:

30.00%

Publisher:

Abstract:

Business process management systems (BPMS) belong to a class of enterprise information systems that are characterized by their dependence on explicitly modeled process logic. Through the process logic, it is relatively easy to manage explicitly the routing and allocation of work items along a business process through the system. Inspired by the DeLone and McLean framework, we theorize that these process-aware system features are important attributes of system quality, which in turn will elevate key user evaluations such as perceived usefulness and usage satisfaction. We examine this theoretical model using data collected from four different, mostly mature BPM system projects. Our findings validate the importance of input quality as well as allocation and routing attributes as antecedents of system quality, which, in turn, determines both usefulness and satisfaction with the system. We further demonstrate how service quality and workflow dependency are significant precursors to perceived usefulness. Our results suggest the appropriateness of a multi-dimensional conception of system quality for future research, and provide important design-oriented advice for the design and configuration of BPMSs.

Relevance:

30.00%

Publisher:

Abstract:

Classifier selection is a problem encountered by multi-biometric systems that aim to improve performance through fusion of decisions. A particular decision fusion architecture that combines multiple instances (n classifiers) and multiple samples (m attempts at each classifier) has been proposed in previous work to achieve a controlled trade-off between false alarms and false rejects. Although analysis on text-dependent speaker verification has demonstrated better performance for fusion of decisions with favourable dependence compared to statistically independent decisions, the performance is not always optimal. Given a pool of instances, best performance with this architecture is obtained for certain combinations of instances. Heuristic rules and diversity measures have been commonly used for classifier selection, but it is shown that optimal performance is achieved with the 'best combination performance' rule. As the search complexity for this rule increases exponentially with the addition of classifiers, a measure - the sequential error ratio (SER) - is proposed in this work that is specifically adapted to the characteristics of the sequential fusion architecture. The proposed measure can be used to select the classifier that is most likely to produce a correct decision at each stage. Error rates for fusion of text-dependent HMM-based speaker models using SER are compared with those of other classifier selection methodologies. SER is shown to achieve near-optimal performance for sequential fusion of multiple instances, with or without the use of multiple samples. The methodology applies to multiple speech utterances for telephone or internet-based access control, and to other systems such as identity verification based on multiple fingerprints or multiple handwriting samples.
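The SER measure itself is not specified in the abstract, so it is not reproduced here; the sketch below only illustrates the sequential n-instance, m-sample architecture under an assumed decision rule (accept an instance if any of its m attempts verifies, accept the claim only if every instance accepts, and stop at the first rejection). The order in which `classifiers` are visited is what a selection measure such as SER would determine.

```python
from typing import Callable, Sequence

def sequential_fusion(classifiers: Sequence[Callable[[object], bool]],
                      get_sample: Callable[[int, int], object],
                      m_attempts: int) -> bool:
    """Sequential multi-instance, multi-sample decision fusion (assumed rule)."""
    for idx, clf in enumerate(classifiers):      # instances evaluated in order
        accepted = False
        for attempt in range(m_attempts):        # repeated samples per instance
            if clf(get_sample(idx, attempt)):
                accepted = True
                break                            # early exit once this instance accepts
        if not accepted:
            return False                         # first rejecting instance ends the sequence
    return True
```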

Relevance:

30.00%

Publisher:

Abstract:

The experiences of people affected by cancer are at the very heart of nursing research efforts. Because much of our work is focused on understanding how to improve experiences and outcomes for people with cancer, it is easy for us to believe that our research is inherently "person-centered" and thus collaborative. Let's reflect on what truly collaborative approaches to cancer nursing research could be like, and how we measure up to such goals. Collaboration between people affected by cancer (consumers) and nurses in research is much more than providing a voice for individuals as participants in a research study. Today, research governing bodies in many countries require us to seek a different kind of consumer participation, where consumers and researchers work in partnership with one another to shape decisions about research priorities, policies, and practices.[1] Most granting bodies now require explanations of how consumer and community participation will occur within a study. Ethical imperatives and the concept of patient advocacy also require that we give more considered attention to what is meant by consumer involvement.[2] Consumers provide perspective on what will be relevant, acceptable, feasible, and sensitive research, having lived the experience of cancer. As a result, they offer practical insights that can ensure the successful conduct and better outcomes of research. Some granting bodies now even allocate a proportion of the final score, or assign a "public value" weighting to a grant, to recognize the importance of consumer involvement and reflect the quality of patient involvement in all stages of the research process.[3]

Relevance:

30.00%

Publisher:

Abstract:

In the electricity market environment, coordination of the system reliability and economics of a power system is of great significance in determining the available transfer capability (ATC). In addition, the risks associated with uncertainties should be properly addressed in the ATC determination process for risk-benefit maximization. Against this background, it is necessary that the ATC be optimally allocated and utilized within relative security constraints. First of all, non-sequential Monte Carlo simulation is employed to derive the probability density distribution of the ATC of designated areas, incorporating uncertainty factors. Second, on that basis, a multi-objective optimization model is formulated to determine the multi-area ATC so as to maximize the risk-benefits. Then, the developed model is solved by the fast non-dominated sorting genetic algorithm (NSGA-II), which can decrease the risk caused by uncertainties while coordinating the ATCs of different areas. Finally, the IEEE 118-bus test system is used to demonstrate the essential features of the developed model and the employed algorithm.
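As a rough illustration of the first step only, the sketch below draws non-sequential Monte Carlo samples of the system state and load level and collects the resulting ATC values into an empirical distribution; `evaluate_atc`, the outage model and the load-uncertainty model are all placeholders for the paper's power-flow-based calculation.

```python
import numpy as np

def atc_distribution(evaluate_atc, outage_probs, load_std, n_samples=5000, rng=None):
    """Non-sequential Monte Carlo sketch of an ATC probability distribution.

    evaluate_atc : callable(state, load_factor) -> ATC in MW; placeholder for
                   the power-flow-based ATC calculation.
    outage_probs : per-component forced-outage probabilities.
    load_std     : relative standard deviation of the load uncertainty.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    outage_probs = np.asarray(outage_probs)
    samples = np.empty(n_samples)
    for s in range(n_samples):
        # sample the system state: each component in service (1) or out (0)
        state = (rng.random(outage_probs.size) > outage_probs).astype(int)
        load_factor = rng.normal(1.0, load_std)   # load-level uncertainty
        samples[s] = evaluate_atc(state, load_factor)
    return samples   # e.g. np.histogram(samples) gives the empirical density
```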

Relevance:

30.00%

Publisher:

Abstract:

A demo video showing the BPMVM prototype using several natural user interfaces, such as multi-touch input, full-body tracking and virtual reality.