Abstract:
Objectives The creation of more high-growth firms continues to be a key component of enterprise policy throughout the countries of the OECD. In the UK the developing enterprise policy framework highlights the importance of supporting businesses with growth potential. The difficulty, of course, lies in the ability of those delivering business support policies to accurately identify those businesses, especially at start-up, which will benefit from interventions and experience enhanced growth performance. This paper has a core objective of presenting new data on the number of high-growth firms in the UK and providing an assessment of their economic significance. Approach This paper uses a specially created longitudinal firm-level database based on the Inter-Departmental Business Register (IDBR) held by the Office for National Statistics (ONS) for all private sector businesses in the UK for the period 1997-2008 to investigate the share of high-growth firms (including a sub-set of start-ups more commonly referred to as gazelles) in successive cohorts of start-ups. We apply OECD definitions of high growth and gazelles to this database and are able to quantify for the first time their number (disaggregated by sector, region and size) and importance (employment and sales). Prior Work What is lacking at the core of this policy focus is any comprehensive statistical analysis of the scale and nature of high-growth firms in cohorts of new and established businesses. The evidence base in response to the question “Why do high-growth firms matter?” is surprisingly weak. Important work in this area has been initiated by Bartelsman et al. (2003), Hoffman and Jünge (2006) and Henrekson and Johansson (2009), but to date work in the UK has been limited (BERR, 2008b). Results We report that there were ~11,500 high-growth firms in the UK in both 2005 and 2008. The share of high-growth start-ups in the UK in 2005 (6.3%) was, contrary to the widely held perception in policy circles, higher than in the United States (5.2%). Of particular interest in the analysis are the growth trajectories (patterns of growth) of these firms as well as the extent to which they are restricted to technology-based or knowledge-based sectors. Implications and Value Using hitherto unused population data, we answer a fundamental research and policy question on the number and scale of high-growth firms in the UK. We draw the conclusion that this ‘rare’ event does not readily lend itself to policy intervention on the grounds that the significant effort needed to identify such businesses ex ante would appear unjustified even if it were possible.
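For concreteness, a minimal sketch of the OECD definitions applied above: average annualised growth above 20% per annum over a three-year period, from a base of at least 10 employees, with gazelles additionally at most five years old. The firm fields and function names are illustrative, not the actual IDBR processing.

```python
# Illustrative sketch of the OECD high-growth test applied in the paper.
# Field names and thresholds handling are hypothetical, not the IDBR schema.

def annualised_growth(emp_start: float, emp_end: float, years: int = 3) -> float:
    """Average annualised growth rate over the observation window."""
    return (emp_end / emp_start) ** (1.0 / years) - 1.0

def is_high_growth(emp_start: float, emp_end: float) -> bool:
    # At least 10 employees at the start, >20% average annual growth.
    return emp_start >= 10 and annualised_growth(emp_start, emp_end) > 0.20

def is_gazelle(emp_start: float, emp_end: float, firm_age_years: int) -> bool:
    # Gazelles are high-growth firms no more than five years old at the
    # end of the three-year observation period.
    return is_high_growth(emp_start, emp_end) and firm_age_years <= 5

# Example: a firm growing from 12 to 25 employees over three years
# (~28% p.a.) qualifies as high growth.
print(is_high_growth(12, 25))  # True
```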
Abstract:
We present a compact, portable and low-cost generic interrogation strain sensor system using a fibre Bragg grating configured in transmission mode with a vertical-cavity surface-emitting laser (VCSEL) light source and a GaAs photodetector embedded in a polymer skin. The photocurrent value is read and stored by a microcontroller. In addition, the photocurrent data is sent via Bluetooth to a computer or tablet device that can present the live data in a real-time graph. With a matched grating and VCSEL, the system is able to automatically scan and lock the VCSEL to the most sensitive edge of the grating. Commercially available VCSEL and photodetector chips are thinned down to 20 µm and integrated in an ultra-thin flexible optical foil using several thin-film deposition steps. A dedicated micro-mirror plug is fabricated to couple the driving optoelectronics to the fibre sensors. The resulting optoelectronic package can be embedded in a thin, planar sensing sheet; the host material for this sheet is a flexible and stretchable polymer. The result is a fully embedded fibre sensing system - a photonic skin. Further investigations are currently being carried out to determine the stability and robustness of the embedded optoelectronic components. © 2012 Copyright Society of Photo-Optical Instrumentation Engineers (SPIE).
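A plausible sketch of the scan-and-lock step described above: sweep the VCSEL drive current (which tunes its wavelength), record the photocurrent through the grating, and park the laser where the transmission slope is steepest, i.e. the most strain-sensitive edge. The hardware access functions are placeholders, since the real interfaces are not specified in the abstract.

```python
# Hedged sketch of the edge scan-and-lock; set_current and
# read_photocurrent are placeholder callables, not a real driver API.
import numpy as np

def scan_and_lock(set_current, read_photocurrent,
                  i_min=1.0, i_max=8.0, steps=200):
    currents = np.linspace(i_min, i_max, steps)
    photocurrents = []
    for i in currents:
        set_current(i)                      # tune the VCSEL wavelength
        photocurrents.append(read_photocurrent())
    slope = np.gradient(np.asarray(photocurrents), currents)
    lock_point = currents[np.argmax(np.abs(slope))]  # steepest edge
    set_current(lock_point)                 # park on the sensitive edge
    return lock_point
```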
Abstract:
We consider an uncertain version of the scheduling problem of sequencing a set of jobs J on a single machine to minimize the weighted total flow time, provided that the processing time of a job can take any real value from a given closed interval. The processing time of a job is assumed to be a random variable that is unknown before its actual occurrence, and the probability distribution of such a variable between the given lower and upper bounds is unknown before scheduling. We develop dominance relations on the set of jobs J. The necessary and sufficient conditions for a job domination may be tested in time polynomial in the number n = |J| of jobs. If there is no domination within some subset of the set J, a heuristic procedure to minimize the weighted total flow time is used for sequencing the jobs from such a subset. Computational experiments for randomly generated single-machine scheduling problems with n ≤ 700 show that the developed dominance relations are quite helpful in minimizing the weighted total flow time of n jobs with uncertain processing times.
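As an illustration of how such dominance relations can work, the sketch below uses a natural sufficient condition consistent with the WSPT (weighted shortest processing time) rule for the deterministic problem; this condition and the midpoint fallback are assumptions for illustration, not necessarily the paper's exact conditions.

```python
# If w_i / u_i >= w_j / l_j, job i has at least job j's WSPT priority
# under every realization of the uncertain processing times, so i can
# safely precede j. Sufficient condition only, assumed for illustration.
from dataclasses import dataclass

@dataclass
class Job:
    weight: float
    lo: float   # lower bound on processing time
    hi: float   # upper bound on processing time

def dominates(i: Job, j: Job) -> bool:
    """True if job i precedes job j in every WSPT-optimal sequence."""
    return i.weight / i.hi >= j.weight / j.lo

def heuristic_order(jobs):
    # When no domination holds, fall back to WSPT on midpoint times,
    # a common heuristic choice (an assumption here, not from the paper).
    return sorted(jobs, key=lambda j: -j.weight / ((j.lo + j.hi) / 2))

jobs = [Job(5, 2, 3), Job(1, 4, 6), Job(3, 1, 2)]
print(dominates(jobs[0], jobs[1]))   # True: 5/3 >= 1/4
print([(j.weight, j.lo, j.hi) for j in heuristic_order(jobs)])
```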
Abstract:
In this paper, a learning algorithm for adjusting the weight coefficients of the Cascade Neo-Fuzzy Neural Network (CNFNN) in sequential mode is introduced. The architecture is similar in structure to the Cascade-Correlation Learning Architecture proposed by S.E. Fahlman and C. Lebiere, but differs from it in the type of artificial neurons used. The CNFNN consists of neo-fuzzy neurons, which can be adjusted using high-speed linear learning procedures. The proposed CNFNN is characterized by a high learning rate and a small required training sample, and its operation can be described by fuzzy linguistic “if-then” rules, providing “transparency” of the obtained results compared with conventional neural networks. The online learning algorithm allows input data to be processed sequentially in real time.
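A minimal sketch of the kind of neo-fuzzy neuron such cascades are built from: each input passes through triangular membership functions whose firing strengths weight a set of tunable synapses, adjusted online by a linear LMS-style rule. The membership grid and learning rate are illustrative choices, not taken from the paper.

```python
# Sketch of a single neo-fuzzy neuron with complementary triangular
# membership functions and a linear online weight update (assumed setup).
import numpy as np

class NeoFuzzyNeuron:
    def __init__(self, n_inputs, n_mf=5, lr=0.1, lo=0.0, hi=1.0):
        self.centers = np.linspace(lo, hi, n_mf)   # MF centers per input
        self.w = np.zeros((n_inputs, n_mf))        # synaptic weights
        self.lr = lr

    def _memberships(self, x):
        # Complementary triangular MFs: linear interpolation weights
        mu = np.zeros_like(self.w)
        step = self.centers[1] - self.centers[0]
        for i, xi in enumerate(x):
            mu[i] = np.maximum(0.0, 1.0 - np.abs(xi - self.centers) / step)
        return mu

    def predict(self, x):
        return float(np.sum(self._memberships(x) * self.w))

    def learn(self, x, target):
        mu = self._memberships(x)
        error = target - float(np.sum(mu * self.w))
        self.w += self.lr * error * mu             # linear LMS-style update
        return error

nfn = NeoFuzzyNeuron(n_inputs=2)
for _ in range(200):
    nfn.learn([0.2, 0.7], target=0.5)
print(round(nfn.predict([0.2, 0.7]), 3))           # converges toward 0.5
```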
Detecting Precipitation Climate Changes: An Approach Based on a Stochastic Daily Precipitation Model
Abstract:
2002 Mathematics Subject Classification: 62M10.
Abstract:
Points of transition, when major life roles undergo change, tend to be associated with an increased need for social support. The transition from adolescence to adulthood is ideal for the examination of the effect of normative stress on the development and functioning of social networks. A questionnaire was designed based on the convoy model to assess the influence of personal and situational characteristics on the utilization of support in the prediction of post-transition adjustment. Data were initially collected for a multi-ethnic sample of 741 sophomores and seniors in high school. Surveys were mailed to participants two years later, and once again the following year. The current study is based on data for 310 participants with complete data for all three time periods. A series of hierarchical regressions were conducted to compare three explanatory models of support: main effect, mediation, and moderation. A main effect model of support on post-transition adjustment was confirmed, a mediator model was not confirmed, and a moderator model was marginally confirmed. Family and friend support was related to significantly lower levels of loneliness, particularly for those with less adaptable temperaments.
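For readers less familiar with the regression comparison used here, the sketch below shows the logic in minimal form: a main-effect model enters support directly, and a moderator model adds a support-by-temperament interaction. The variable names and toy data frame are hypothetical stand-ins, not the study's data.

```python
# Hedged sketch of main-effect vs. moderation models; a significant
# support:temperament coefficient is the moderation signal.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({   # toy stand-in for the 310 three-wave cases
    "loneliness":  [3.2, 2.1, 4.0, 1.8, 3.5, 2.7],
    "support":     [2.0, 4.5, 1.0, 5.0, 1.5, 3.0],
    "temperament": [2.5, 3.0, 1.5, 4.0, 2.0, 3.5],  # adaptability score
})

main_effect = smf.ols("loneliness ~ support + temperament", df).fit()
moderation  = smf.ols("loneliness ~ support * temperament", df).fit()

print(moderation.params["support:temperament"])  # interaction coefficient
```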
Abstract:
This dissertation develops a new figure of merit to measure the similarity (or dissimilarity) of Gaussian distributions through a novel concept that relates the Fisher distance to the percentage of data overlap. The derivations are expanded to provide a generalized mathematical platform for determining an optimal separating boundary of Gaussian distributions in multiple dimensions. Real-world data used for implementation and in carrying out feasibility studies were provided by Beckman-Coulter. It is noted that although the data used is flow cytometric in nature, the mathematics are general in their derivation and extend to other types of data as long as their statistical behavior approximates Gaussian distributions.

Because this new figure of merit is heavily based on the statistical nature of the data, a new filtering technique is introduced to accommodate the accumulation process involved with histogram data. When data is accumulated into a frequency histogram, it is inherently smoothed in a linear fashion, since an averaging effect takes place as the histogram is generated. This new filtering scheme addresses data that is accumulated in the uneven resolution of the channels of the frequency histogram.

The qualitative interpretation of flow cytometric data is currently a time-consuming and imprecise method for evaluating histogram data. The method proposed here offers a broader spectrum of capabilities in the analysis of histograms, since the figure of merit derived in this dissertation integrates within its mathematics both a measure of similarity and the percentage of overlap between the distributions under analysis.
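As a point of reference, the percentage of data overlap between two Gaussians can be computed numerically as the overlap coefficient, the integral of the pointwise minimum of the two densities. The sketch below is a numerical illustration of that quantity, not the dissertation's closed-form derivation.

```python
# Numerical overlap coefficient OVL = integral of min(f1, f2) for two
# 1-D Gaussians; grid range and resolution are illustrative choices.
import numpy as np
from scipy.stats import norm

def overlap_coefficient(mu1, s1, mu2, s2, n=20001):
    lo = min(mu1 - 6 * s1, mu2 - 6 * s2)
    hi = max(mu1 + 6 * s1, mu2 + 6 * s2)
    x = np.linspace(lo, hi, n)
    f = np.minimum(norm.pdf(x, mu1, s1), norm.pdf(x, mu2, s2))
    dx = x[1] - x[0]
    return float(np.sum(f) * dx)   # fraction of probability mass shared

# Identical distributions overlap fully; well-separated ones barely do.
print(overlap_coefficient(0, 1, 0, 1))   # ~1.0
print(overlap_coefficient(0, 1, 5, 1))   # ~0.012
```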
Abstract:
This research establishes new optimization methods for pattern recognition and classification of different white blood cells in actual patient data to enhance the process of diagnosis. Beckman-Coulter Corporation supplied flow cytometry data of numerous patients, used as training sets to exploit the different physiological characteristics of the samples provided. Support Vector Machines (SVM) and Artificial Neural Networks (ANN) were used as promising pattern classification techniques to identify different white blood cell samples and to provide information to medical doctors in the form of diagnostic references for a specific disease state, leukemia. The obtained results demonstrate that when a neural network classifier is well configured and trained with cross-validation, it can perform better than support vector classifiers alone for this type of data. Furthermore, a new unsupervised learning algorithm, the Density-based Adaptive Window Clustering (DAWC) algorithm, was designed to process large volumes of data and find the locations of high-density data clusters in real time. It reduces the computational load to ~O(N) computations, making the algorithm more attractive and faster than current hierarchical algorithms.
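A minimal sketch of the classifier comparison described above, evaluated with cross-validation; synthetic data stands in for the proprietary Beckman-Coulter samples, and the features, architectures and settings are assumptions.

```python
# Hedged comparison of SVM vs. ANN with 5-fold cross-validation on
# synthetic multi-class data (a stand-in for flow cytometry features).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                                  random_state=0))

print("SVM accuracy:", cross_val_score(svm, X, y, cv=5).mean())
print("ANN accuracy:", cross_val_score(ann, X, y, cv=5).mean())
```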
Abstract:
With advances in science and technology, computing and business intelligence (BI) systems are steadily becoming more complex, with an increasing variety of heterogeneous software and hardware components. They are thus becoming progressively more difficult to monitor, manage and maintain. Traditional approaches to system management have largely relied on domain experts through a knowledge acquisition process that translates domain knowledge into operating rules and policies; this is widely acknowledged to be a cumbersome, labor-intensive, and error-prone process, and one that is difficult to keep up to date with rapidly changing environments. In addition, many traditional business systems deliver primarily pre-defined historic metrics for long-term strategic or mid-term tactical analysis, and lack the necessary flexibility to support evolving metrics or data collection for real-time operational analysis. There is thus a pressing need for automatic and efficient approaches to monitor and manage complex computing and BI systems. To realize the goal of autonomic management and enable self-management capabilities, we propose to mine the historical log data generated by computing and BI systems and automatically extract actionable patterns from this data. This dissertation focuses on the development of different data mining techniques to extract actionable patterns from various types of log data in computing and BI systems. Four key problems are studied: log data categorization and event summarization, leading indicator identification, pattern prioritization by exploring link structures, and a tensor model for three-way log data. Case studies and comprehensive experiments on real application scenarios and datasets are conducted to show the effectiveness of the proposed approaches.
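To make the fourth problem concrete, the sketch below builds a three-way (time window × event type × source) count tensor from toy log records, the kind of structure a tensor model for log data operates on; the log format and choice of modes are hypothetical, not the dissertation's.

```python
# Hedged sketch: organize log records as a 3-way count tensor.
import numpy as np

logs = [  # (timestamp-hour, event type, source host) toy records
    (0, "disk_warn", "node1"), (0, "disk_warn", "node2"),
    (1, "net_error", "node1"), (2, "disk_warn", "node1"),
]

hours   = sorted({t for t, _, _ in logs})
events  = sorted({e for _, e, _ in logs})
sources = sorted({s for _, _, s in logs})

tensor = np.zeros((len(hours), len(events), len(sources)))
for t, e, s in logs:
    tensor[hours.index(t), events.index(e), sources.index(s)] += 1

print(tensor.shape)                              # (3, 2, 2)
print(tensor[0, events.index("disk_warn"), :])   # per-source counts, hour 0
```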
Abstract:
In recent years, wireless communication infrastructures have been widely deployed for both personal and business applications. IEEE 802.11 series Wireless Local Area Network (WLAN) standards attract a lot of attention due to their low cost and high data rates. Wireless ad hoc networks using IEEE 802.11 standards are one of the hot spots of recent network research, and designing appropriate Media Access Control (MAC) layer protocols is one of the key issues for such networks.

Existing wireless applications typically use omni-directional antennas. When using an omni-directional antenna, the gain of the antenna is the same in all directions. Due to the nature of the Distributed Coordination Function (DCF) mechanism of the IEEE 802.11 standards, only one of the one-hop neighbors can send data at a time; nodes other than the sender and the receiver must be either idle or listening, otherwise collisions could occur. The downside of the omni-directionality of antennas is that the spatial reuse ratio is low and the capacity of the network is considerably limited.

Directional antennas have therefore been introduced to improve spatial reuse. A directional antenna has the following benefits: it can improve transport capacity by decreasing the interference of a directional main lobe; it can increase coverage range due to a higher SINR (Signal to Interference plus Noise Ratio), i.e., with the same power consumption, better connectivity can be achieved; and power usage can be reduced, i.e., for the same coverage, a transmitter can reduce its power consumption.

To utilize the advantages of directional antennas, we propose a relay-enabled MAC protocol. Two relay nodes are chosen to forward data when the channel condition of the direct link from the sender to the receiver is poor. The two relay nodes can transfer data at the same time, and pipelined data transmission can be achieved using directional antennas. Throughput can be improved significantly by introducing the relay-enabled MAC protocol.

Besides these strong points, directional antennas also have some explicit drawbacks, such as the hidden terminal and deafness problems and the requirement of maintaining location information for each node. Therefore, an omni-directional antenna should be used in some situations. The combined use of omni-directional and directional antennas leads to the problem of configuring heterogeneous antennas, i.e., given a network topology and a traffic pattern, finding a tradeoff between using omni-directional and directional antennas to obtain better network performance.

Directly and mathematically establishing the relationship between network performance and antenna configurations is extremely difficult, if not intractable. Therefore, in this research we propose several clustering-based methods to obtain approximate solutions to the heterogeneous antenna configuration problem, which can improve network performance significantly.

The proposed methods consist of two steps, sketched below. The first step (clustering links) clusters the links into groups based on a matrix-based system model; after clustering, the links in the same group have similar neighborhood nodes and will use the same type of antenna. The second step (labeling links) decides the type of antenna for each group: some groups of links will use directional antennas and others will adopt omni-directional antennas. Experiments are conducted to compare the proposed methods with existing methods, and the results demonstrate that the clustering-based methods can improve network performance significantly.
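A hedged sketch of the two-step idea: represent each link by a binary vector over neighborhood nodes, cluster links with similar neighborhoods, then label each whole group with one antenna type. The topology, the use of k-means, and the labeling rule are illustrative assumptions, not the dissertation's exact model.

```python
# Step 1: cluster links by neighborhood similarity; step 2: label groups.
import numpy as np
from sklearn.cluster import KMeans

# rows: links, columns: nodes; 1 if the node neighbors that link (toy data)
neighborhood = np.array([
    [1, 1, 1, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 0],
])

groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(neighborhood)

# Densely-neighbored groups (more potential interference) get directional
# antennas here -- an assumed labeling rule for illustration.
for g in np.unique(groups):
    density = neighborhood[groups == g].sum(axis=1).mean()
    antenna = "directional" if density > 2.75 else "omni-directional"
    print(f"group {g}: links {np.where(groups == g)[0]} -> {antenna}")
```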
Abstract:
Type systems for secure information flow aim to prevent a program from leaking information from H (high) to L (low) variables. Traditionally, bisimulation has been the prevalent technique for proving the soundness of such systems. This work introduces a new proof technique based on stripping and fast simulation, and shows that it can be applied in a number of cases where bisimulation fails. We present a progressive development of this technique over a representative sample of languages including a simple imperative language (core theory), a multiprocessing nondeterministic language, a probabilistic language, and a language with cryptographic primitives. In the core theory we illustrate the key concepts of this technique in a basic setting. A fast low simulation in the context of transition systems is a binary relation where simulating states can match the moves of simulated states while maintaining the equivalence of low variables; stripping is a function that removes high commands from programs. We show that we can prove secure information flow by arguing that the stripping relation is a fast low simulation. We then extend the core theory to an abstract distributed language under a nondeterministic scheduler. Next, we extend to a probabilistic language with a random assignment command; we generalize fast simulation to the setting of discrete-time Markov chains, and prove approximate probabilistic noninterference. Finally, we introduce cryptographic primitives into the probabilistic language and prove computational noninterference, provided that the underlying encryption scheme is secure.
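A small sketch of the stripping idea from the core theory, over a toy imperative AST (an assumption; the dissertation's formal language is richer): stripping erases high commands so that the stripped program's low behavior can be matched against the original's via fast low simulation.

```python
# Toy stripping function: erase assignments to H variables from an AST.
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Assign:
    var: str                     # variable being written
    expr: str                    # right-hand side (opaque here)

@dataclass
class Seq:
    cmds: List["Cmd"]

Cmd = Union[Assign, Seq]
HIGH = {"h", "secret"}           # the H variables (illustrative)

def strip(cmd: Cmd) -> Cmd:
    """Erase assignments to high variables, keeping low commands."""
    if isinstance(cmd, Assign):
        return Seq([]) if cmd.var in HIGH else cmd
    return Seq([strip(c) for c in cmd.cmds])

prog = Seq([Assign("h", "secret_input"), Assign("l", "l + 1")])
print(strip(prog))   # only the low assignment survives
```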
Abstract:
Taylor Slough, in Everglades National Park, has experienced an evolution of water management infrastructure since drainage activities arrived in South Florida. This has included the excavation of canals, installation of large capacity pump stations, and a variety of operational strategies focused on resolving the conflict between managing the water level for developed areas while providing water supply for Everglades National Park. This study provides a review of water management practices and the concurrent hydrologic conditions in the Taylor Slough basin and adjacent canal system from 1961 through 2010. Analyses of flow, water level and rainfall data were divided into time periods that correspond to significant changes in structural features and operational plans. In the early 1960s, Taylor Slough was disconnected from the greater Everglades system by the construction of levees upstream. As water supply for Taylor Slough became more urgent, the Slough was connected to the regional water supply system via a network of canals and pump stations to relieve over-drained conditions. The increased water supply and pump capacity succeeded in raising water level and increasing flow and hydroperiod in the marsh.
Abstract:
This thesis research describes the design and implementation of a Semantic Geographic Information System (GIS) and the creation of its spatial database. The database schema is designed and created, and all textual and spatial data are loaded into the database with the help of the Semantic DBMS's Binary Database Interface, currently being developed at FIU's High Performance Database Research Center (HPDRC). A friendly graphical user interface is created together with the system's other main areas: the displaying process, data animation, and data retrieval. All these components are tightly integrated to form a novel and practical semantic GIS that has facilitated the interpretation, manipulation, analysis, and display of spatial data such as ocean temperature, ozone (TOMS), and simulated SeaWiFS data. At the same time, this system has played a major role in the testing process of the HPDRC's high-performance and efficient parallel Semantic DBMS.
Abstract:
The sustainable use of waste from agribusiness is currently a focus of research, especially sugar cane bagasse (BCA), the lignocellulosic waste produced in the greatest volume in Brazilian agribusiness, where this residual biomass has been applied in the production of energy and bioproducts. In this work, high-purity pulp was produced from BCA by soda/anthraquinone pulping and subsequently converted to cellulose acetate. Commercial Avicel cellulose was used for comparison. Cellulose acetate was obtained by a homogeneous acetylation reaction, varying the reaction time in hours (8, 12, 16, 20 and 24) and the temperature in °C (25 and 50). FTIR spectra showed characteristic bands identical to those of cellulosic materials, demonstrating the efficiency of separation by pulping. The cellulose acetate obtained was characterized by infrared spectroscopy (FTIR), X-ray diffraction (XRD), thermogravimetric analysis (TG/DTG/DSC) and scanning electron microscopy (SEM), and the degree of substitution (DS) was determined to confirm acetylation. The optimal reaction times for obtaining diacetates and triacetates at both temperatures were 20 and 24 h. Cellulose acetate produced from BCA presented DS values between 2.57 and 2.70 at 25 °C, while at 50 °C the DS values obtained were 2.66 and 2.84, indicating the effective conversion of BCA cellulose into di- and triacetates. For comparison, commercial Avicel cellulose showed DS values of 2.78 and 2.76 at 25 °C and 2.77 and 2.75 at 50 °C, for reaction times of 20 h and 24 h, respectively. The best result was the cellulose acetate synthesized from BCA with DS 2.84 at 50 °C and 24 h, classified as cellulose triacetate, a result superior to that produced with commercial Avicel cellulose and demonstrating the potential of converting cellulose derived from a low-cost lignocellulosic residue (BCA), with prospects for commercial use of the resulting cellulose acetate.
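For orientation, DS values such as those above are commonly obtained from the measured acetyl content via the standard saponification relation (e.g., ASTM D871); the formula below is supplied for reference and is not stated in the abstract itself.

```latex
% Standard acetyl-content / degree-of-substitution relation (ASTM D871),
% added for reference; %Acetyl is the acetyl weight percentage.
\[
  \mathrm{DS} = \frac{3.86 \times \%\text{Acetyl}}{102.4 - \%\text{Acetyl}},
  \qquad
  \mathrm{DS} = 2.84 \;\Longleftrightarrow\; \%\text{Acetyl} \approx 43.4\,\%
\]
```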