Abstract:
The location of previously unseen and unregistered individuals in complex camera networks from semantic descriptions is a time-consuming and often inaccurate process carried out by human operators or security staff on the ground. To promote the development and evaluation of automated semantic-description-based localisation systems, we present a new, publicly available, unconstrained 110-sequence database collected from 6 stationary cameras. Each sequence contains detailed semantic information for a single search subject who appears in the clip (gender, age, height, build, hair and skin colour, clothing type, texture and colour), and between 21 and 290 frames of each clip are annotated with the target subject location (over 11,000 frames are annotated in total). A novel approach for localising a person given a semantic query is also proposed and demonstrated on this database. The proposed approach incorporates clothing colour and type (for clothing worn below the waist), as well as height and build, to detect people. A method to assess the quality of candidate regions, as well as a symmetry-driven approach to aid in modelling clothing on the lower half of the body, is proposed within this approach. An evaluation on the proposed dataset shows that a relative improvement in localisation accuracy of up to 21% is achieved over the baseline technique.
Abstract:
Time series classification has been extensively explored in many fields of study. Most methods are based on historical or current information extracted from data. However, if interest lies in a specific future time period, methods that relate directly to forecasts of the time series are much more appropriate. An approach to time series classification is proposed based on a polarization measure of forecast densities of time series. By fitting autoregressive models, forecast replicates of each time series are obtained via the bias-corrected bootstrap, and a stationarity correction is applied when necessary. Kernel estimators are then employed to approximate forecast densities, and discrepancies between the forecast densities of pairs of time series are estimated by a polarization measure, which evaluates the extent to which two densities overlap. Following the distributional properties of the polarization measure, a discriminant rule and a clustering method are proposed to conduct supervised and unsupervised classification, respectively. The proposed methodology is applied to both simulated and real data sets, and the results show desirable properties.
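As an illustration of the density-overlap idea behind such a polarization measure, the sketch below (our own construction; the paper's exact measure, bootstrap procedure, and kernel choices are not given in the abstract) estimates two forecast densities with a Gaussian kernel and integrates their pointwise minimum, so that identical densities score near 1 and disjoint ones near 0:

```python
import math
import random

def gaussian_kde(samples, bandwidth):
    """Kernel density estimate built from forecast replicates."""
    n = len(samples)
    def f(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                   for s in samples) / (n * bandwidth * math.sqrt(2.0 * math.pi))
    return f

def overlap(f, g, lo, hi, steps=800):
    """Integrate min(f, g) over [lo, hi]: ~1 for identical densities, ~0 for disjoint."""
    dx = (hi - lo) / steps
    return sum(min(f(lo + (i + 0.5) * dx), g(lo + (i + 0.5) * dx)) * dx
               for i in range(steps))

random.seed(0)
# Hypothetical bootstrap forecast replicates for three series.
a = [random.gauss(0.0, 1.0) for _ in range(200)]
b = [random.gauss(0.5, 1.0) for _ in range(200)]  # close to a
c = [random.gauss(4.0, 1.0) for _ in range(200)]  # far from a

kde = lambda s: gaussian_kde(s, 0.4)
o_ab = overlap(kde(a), kde(b), -5.0, 9.0)  # large: densities overlap heavily
o_ac = overlap(kde(a), kde(c), -5.0, 9.0)  # small: densities barely overlap
```

A discriminant rule can then assign a new series to whichever class its forecast density overlaps most.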
Abstract:
Competency research in the rehabilitation profession, and that of rehabilitation counseling in particular, has an extensive pedigree. This article reviews the significant research in the field and details several of the instruments used in competency research to date. Issues concerning the current use of competency research and the future role of such research are discussed.
Abstract:
A set system (X, F) with X = {x_1, ..., x_m} and F = {B_1, ..., B_n}, where B_i ⊆ X, is called an (n, m) cover-free set system (or CF set system) if for any 1 ≤ i, j, k ≤ n with j ≠ k, |B_i| ≥ 2|B_j ∩ B_k| + 1. In this paper, we show that CF set systems can be used to construct anonymous membership broadcast schemes (or AMB schemes), allowing a center to broadcast a secret identity among a set of users in such a way that the users can verify whether or not the broadcast message contains their valid identity. Our goal is to construct (n, m) CF set systems in which, for given m, the value n is as large as possible. We give two constructions for CF set systems, the first from error-correcting codes and the other from combinatorial designs. We link CF set systems to the concept of cover-free families studied by Erdős et al. in the early 80s to derive bounds on the parameters of CF set systems. We also discuss some possible extensions of the current work, motivated by different applications.
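The CF condition can be checked mechanically for a candidate set system. The sketch below assumes the condition |B_i| ≥ 2|B_j ∩ B_k| + 1 for all i and all j ≠ k, as reconstructed from the abstract (the original's inequality is partially garbled, so the exact form is an assumption):

```python
def is_cover_free(blocks):
    """Check the (n, m) CF condition as reconstructed from the abstract:
    |B_i| >= 2*|B_j ∩ B_k| + 1 for all 1 <= i, j, k <= n with j != k.
    `blocks` is a list of sets over the ground set X."""
    n = len(blocks)
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if j == k:
                    continue
                if len(blocks[i]) < 2 * len(blocks[j] & blocks[k]) + 1:
                    return False
    return True

# Pairwise disjoint blocks satisfy the condition trivially ...
disjoint = [{1, 2, 3}, {4, 5, 6}, {7, 8, 9}]
# ... while a large pairwise intersection violates it.
overlapping = [{1, 2, 3}, {2, 3, 4}, {5, 6, 7}]
```

The interesting constructions in the paper, from codes and combinatorial designs, produce much larger families than such toy examples.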
Abstract:
Contextual factors for sustainable development, such as population growth, energy and resource availability, consumption levels, food production yield, and growth in pollution, provide numerous complex and rapidly changing education and training requirements for a variety of professions, including engineering. Furthermore, these requirements may not be clearly understood or expressed by designers, governments, professional bodies or industry. Within this context, this paper focuses on one priority area for greening the economy through sustainable development (improving energy efficiency) and discusses the complexity of capacity building needs for professionals. The paper begins by acknowledging the historical evolution of sustainability considerations and the complexity embedded in built environment solutions. The authors propose a dual-track approach to capacity building, with a short-term focus on improvement (i.e., making peaking challenges a priority for postgraduate education) and a long-term focus on transformational innovation (i.e., making tailing challenges a priority for undergraduate education). A case study of Australian experiences over the last decade in the topic area of energy efficiency is provided. The authors conclude with reflections on the implications for the approach.
Abstract:
This article elucidates and analyzes the fundamental underlying structure of the renormalization group (RG) approach as it applies to the solution of any differential equation involving multiple scales. The amplitude equation derived through the elimination of secular terms arising from a naive perturbation expansion of the solution to these equations by the RG approach is reduced to an algebraic equation which is expressed in terms of the Thiele semi-invariants or cumulants of the eliminant sequence {Z_i}_{i≥1}. Its use is illustrated through the solution of both linear and nonlinear perturbation problems and certain results from the literature are recovered as special cases. The fundamental structure that emerges from the application of the RG approach is not the amplitude equation but the aforementioned algebraic equation. © 2008 The American Physical Society.
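As a concrete illustration of the secular-term elimination that the RG approach performs (a standard textbook example, not one taken from the paper itself), consider the weakly damped oscillator:

```latex
y'' + \epsilon y' + y = 0, \qquad 0 < \epsilon \ll 1 .
```

A naive expansion $y = y_0 + \epsilon y_1 + O(\epsilon^2)$ gives $y_0 = A\cos(t+\phi)$ and a first-order correction containing the secular term

```latex
y_1 = -\tfrac{1}{2} A\, t \cos(t+\phi) + \cdots ,
```

which grows without bound in $t$. Renormalizing the amplitude at an arbitrary reference time $\tau$ and demanding that the solution be independent of $\tau$ yields the amplitude equation

```latex
\frac{dA}{d\tau} = -\frac{\epsilon}{2}\, A ,
```

so $y \approx A_0\, e^{-\epsilon t/2} \cos(t+\phi)$: the secular growth is absorbed into a slowly decaying amplitude.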
Abstract:
We study the multicast stream authentication problem when an opponent can drop, reorder and inject data packets into the communication channel. In this context, bandwidth limitation and fast authentication are the core concerns. Therefore, any authentication scheme should reduce as much as possible the packet overhead and the time spent at the receiver to check the authenticity of collected elements. Recently, Tartary and Wang developed a provably secure protocol with small packet overhead and a reduced number of signature verifications to be performed at the receiver. In this paper, we propose a hybrid scheme based on Tartary and Wang's approach and Merkle hash trees. Our construction exhibits a smaller overhead and much faster processing at the receiver, making it even more suitable for multicast than the earlier approach. Like Tartary and Wang's protocol, our construction is provably secure and allows the total recovery of the data stream despite erasures and injections occurring during transmission.
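To see why a Merkle hash tree helps here: one signature over the tree root amortizes across an entire block of packets, and each packet can be verified individually from its authentication path of sibling hashes. The sketch below is illustrative only (the packet layout and hash conventions of the actual scheme may differ):

```python
import hashlib

def _hash(data):
    return hashlib.sha256(data).digest()

def merkle_root(packets):
    """Root of a Merkle hash tree over packet payloads (one signature covers all)."""
    level = [_hash(p) for p in packets]
    while len(level) > 1:
        if len(level) % 2:               # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [_hash(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(packets, index):
    """Sibling hashes a receiver needs to recompute the root from one packet."""
    level = [_hash(p) for p in packets]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        path.append((level[sibling], sibling < index))
        level = [_hash(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(packet, path, root):
    """Authenticate a single received packet against the (signed) root."""
    h = _hash(packet)
    for sibling, sibling_is_left in path:
        h = _hash(sibling + h) if sibling_is_left else _hash(h + sibling)
    return h == root
```

A receiver that has verified the signature on the root can then authenticate each arriving packet with a handful of hash evaluations, even when other packets of the block are dropped.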
Abstract:
This paper introduces an integral approach to the study of plasma-surface interactions during the catalytic growth of selected nanostructures (NSs). This approach involves a basic understanding of the plasma-specific effects in NS nucleation and growth, theoretical modelling, numerical simulations, plasma diagnostics, and surface microanalysis. Using the example of plasma-assisted growth of surface-supported single-walled carbon nanotubes, we discuss how the combination of these techniques may help improve the outcomes of the growth process. A specific focus here is on the effects of nanoscale plasma-surface interactions on NS growth and on how the available techniques may be used, both in situ and ex situ, to optimize the growth process and the structural parameters of NSs.
Abstract:
A simple, effective, and innovative approach based on ion-assisted self-organization is proposed to synthesize size-selected Si quantum dots (QDs) on SiC substrates at low substrate temperatures. Using hybrid numerical simulations, the formation of Si QDs through a self-organization approach is investigated by taking into account two distinct cases of Si QD formation using the ionization energy approximation theory, which considers ionized influxes containing Si3+ and Si1+ ions in the presence of a microscopic nonuniform electric field induced by a variable surface bias. The results show that the highest percentage of surface coverage by 1 and 2 nm size-selected QDs was achieved using a bias of -20 V and ions in the lowest charge state, namely Si1+ ions, in a low substrate temperature range (227-327 °C). As low substrate temperatures (≤500 °C) are desirable from a technological point of view, because (i) low-temperature deposition techniques are compatible with current thin-film Si-based solar cell fabrication and (ii) high processing temperatures can frequently cause damage to other components in electronic devices and destroy the tandem structure of Si QD-based third-generation solar cells, our results are highly relevant to the development of third-generation all-Si tandem photovoltaic solar cells.
Abstract:
Although the realisation of the potential implications of biosimilars is relatively recent, much has already been written about raising awareness of the differences between biosimilars and originating/reference listed (innovator) pharmaceuticals. The European Medicines Agency has led the global charge in regulating biosimilars. Despite substantial similarities across international regulations, differences do exist across jurisdictions. The regulation of biosimilars demands a congruent approach across all stages: pre-registration (Australian copyright protection, patents, international obligations), registration (confidential information, international regulators, safety and efficacy), and post-registration (Pharmaceutical Benefits Scheme, prescriber and dispenser awareness). Our National Medicines Policy could provide the necessary congruent framework and function for the national and international regulation of biosimilars. The Policy concedes that pharmaceuticals will be affected by financial policies and trade considerations, international treaty obligations, industrial policies, education policies, and the need for public-private partnerships.
Abstract:
This paper presents the details of research undertaken on the development of an energy-based time equivalent approach for light gauge steel frame (LSF) walls. This research utilized an energy-based time equivalent approach to obtain the fire resistance ratings (FRR) of LSF walls exposed to realistic design fires with respect to standard fire exposure [1]. It is based on the equal-area concept of fire severity and relates to the amount of energy transferred to the member. The proposed method was used to predict the fire resistance of single and double plasterboard-lined and externally insulated LSF walls. The predicted fire resistance ratings were compared with the results from finite element analyses and fire design rules for three different wall configurations. The paper also reviews the available time equivalent approaches.
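The equal-area concept can be sketched numerically. The code below is a simplified illustration only: it uses the area under the gas temperature curve as a crude proxy for the energy transferred to the member (the paper's actual energy formulation is more detailed), together with a hypothetical design fire, and finds the standard-fire duration t_eq that accumulates the same area as the design fire does over its whole duration:

```python
import math

def iso834(t):
    """ISO 834 standard fire curve (gas temperature in deg C, t in minutes)."""
    return 20.0 + 345.0 * math.log10(8.0 * t + 1.0)

def area(curve, t_end, dt=0.05):
    """Area under a time-temperature curve (midpoint rule), a crude energy proxy."""
    return sum(curve((i + 0.5) * dt) * dt for i in range(int(t_end / dt)))

def time_equivalent(design_curve, design_duration, dt=0.05):
    """Equal-area concept: standard-fire duration delivering the same area
    as the design fire does over its full duration."""
    target = area(design_curve, design_duration, dt)
    t, acc = 0.0, 0.0
    while acc < target:
        acc += iso834(t + 0.5 * dt) * dt
        t += dt
    return t

def design_fire(t):
    """Hypothetical design fire: linear growth to 800 deg C at 20 min,
    then linear decay back to ambient at 60 min."""
    if t <= 20.0:
        return 20.0 + 39.0 * t
    return max(20.0, 800.0 - 19.5 * (t - 20.0))

t_eq = time_equivalent(design_fire, 60.0)  # minutes of equivalent standard exposure
```

The FRR of a wall under the design fire can then be read off from its known performance under this equivalent duration of standard exposure.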
Abstract:
A high level of control over quantum dot (QD) properties such as size and composition during fabrication is required to precisely tune the eventual electronic properties of the QD. Nanoscale synthesis efforts and theoretical studies of electronic properties are traditionally treated quite separately. In this paper, a combinatorial approach has been taken to relate the process synthesis parameters to the electron confinement properties of the QDs. First, hybrid numerical calculations with different influx parameters for Si1-xCx QDs were carried out to simulate the changes in carbon content x and size. Second, the ionization energy theory was applied to understand the electronic properties of Si1-xCx QDs. Third, stoichiometric (x=0.5) silicon carbide QDs were grown by means of inductively coupled plasma-assisted rf magnetron sputtering. Finally, the effects of QD size and elemental composition were incorporated into the ionization energy theory to explain the evolution of the Si1-xCx photoluminescence spectra. These results are important for the development of deterministic synthesis approaches for self-assembled nanoscale quantum confinement structures.
Abstract:
A comparative study involving both experimental and numerical investigations was carried out to resolve the long-standing problem of understanding the electron conductivity mechanism across a magnetic field in low-temperature plasmas. We calculated the plasma parameters from the experimentally obtained electric field distribution, and then made a 'back' comparison with the distributions of electron energy and plasma density obtained in the experiment. This approach significantly reduces the influence of assumptions about the particular phenomenology of electron conductivity in the plasma. The results of the experiment and of the calculations made by this technique show that classical conductivity is not capable of providing realistic total current and electron energy, whereas the phenomenological anomalous Bohm mobility demonstrates very good agreement with the experiment. These results provide evidence in favor of Bohm conductivity, making it possible to clarify this pressing, long-standing question about the main driving mechanism responsible for electron transport in low-temperature plasmas.
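The two transport models compared in the study can be stated compactly. The sketch below contrasts the classical cross-field mobility with the phenomenological Bohm mobility 1/(16B); the plasma parameters are hypothetical, chosen only to show that Bohm transport dominates when the ratio of cyclotron to collision frequency is large, consistent with the abstract's finding that classical conductivity underpredicts the current:

```python
E = 1.602e-19    # elementary charge, C
M_E = 9.109e-31  # electron mass, kg

def classical_mobility(B, nu):
    """Classical cross-field electron mobility, mu = (e/(m*nu)) / (1 + (wc/nu)^2),
    with wc = e*B/m the electron cyclotron frequency."""
    wc = E * B / M_E
    return (E / (M_E * nu)) / (1.0 + (wc / nu) ** 2)

def bohm_mobility(B):
    """Phenomenological (anomalous) Bohm mobility, mu_B = 1/(16*B)."""
    return 1.0 / (16.0 * B)

# Hypothetical magnetized low-temperature plasma parameters.
B = 0.015    # magnetic field, T
nu = 1.0e7   # electron collision frequency, 1/s
# With wc/nu >> 1 the classical channel is strongly suppressed,
# so the Bohm value exceeds the classical one by a wide margin.
mu_classical = classical_mobility(B, nu)
mu_bohm = bohm_mobility(B)
```

This is why calculations assuming purely classical conductivity cannot reproduce the measured total current, while the Bohm scaling can.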
Abstract:
An approach is proposed and applied to five industries to show how phenomenology can be valuable in rethinking consumer markets (Popp & Holt, 2013). The purpose of this essay is to highlight the potential implications that 'phenomenological thinking' brings for competitiveness and innovation (Sanders, 1982), thus helping managers be more innovative in their strategic marketing decisions (i.e. market creation, positioning, branding). Phenomenology is in fact a way of thinking, besides and before being a qualitative research procedure: a very practical exercise that strategic managers can master and apply in the same successful way as other scientists have already done in their fields of study (e.g. sociology, psychology, psychiatry, and anthropology). Two fundamental considerations justify this research: a lack of distinctiveness among firms due to high levels of competition, and consumers no longer knowing what they want (i.e. no more needs). The authors show how the classical mental framework generally used by practitioners to study markets appears on the one hand to be established and systematic in the life of a company, while on the other it is no longer adequate to meet the needs of innovation required to survive. To the classic principles of objectivity, generality, and psycho-sociology the authors counterpose the imaginary, eidetic-phenomenological reduction, and an existential perspective. From a theoretical point of view, this paper introduces a set of functioning rules applicable to achieving innovation in any market and useful for identifying cultural practices inherent in the act of consumption.
Abstract:
This paper addresses an advanced computational technique for steel structures with both simulation capacities simultaneously; specifically, the higher-order element formulation with element load effect (geometric nonlinearities) and the refined plastic hinge method (material nonlinearities). This advanced computational technique can capture the real second-order inelastic behaviour of a whole structure, which in turn ensures the structural safety and adequacy of the structure. Therefore, the emphasis of this paper is to advocate that the advanced computational technique can replace the traditional empirical design approach. In the meantime, practitioners should be educated in how to make use of the advanced computational technique for the second-order inelastic design of a structure, as this approach is the future of structural engineering design. This means the future engineer should understand the computational technique clearly, realize the behaviour of a structure with respect to the numerical analysis thoroughly, and justify the numerical results correctly, especially since the fool-proof ultimate finite element, one competent in modelling behaviour, user-friendly in numerical modelling, and versatile for all structural forms and various materials, is yet to come. Hence, high-quality engineers are required who can confidently manipulate the advanced computational technique for the design of a complex structure, and not vice versa.