949 results for PERFORMANCES


Relevance:

10.00%

Publisher:

Abstract:

Recent literature has focused on realized volatility models to predict financial risk. This paper studies the benefit of explicitly modeling jumps in this class of models for value at risk (VaR) prediction. Several popular realized volatility models are compared in terms of their VaR forecasting performance through a Monte Carlo study and an analysis based on empirical data on eight Chinese stocks. The results suggest that careful modeling of jumps in realized volatility models can substantially improve VaR prediction, especially for emerging markets, where jumps play a stronger role than in developed markets.
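
A minimal sketch of the modelling idea, assuming a HAR-type realized-volatility regression with an explicit jump regressor (one popular member of the model class compared in such studies, not the paper's exact specification); `rv` is a daily realized-variance series and `jumps` a pre-computed jump component, e.g. from a bipower-variation decomposition:

```python
# A sketch of a HAR-J regression: tomorrow's realized variance is explained by
# daily/weekly/monthly RV averages plus yesterday's jump component, and the
# forecast is mapped to a VaR through a Gaussian quantile.
import numpy as np
from scipy.stats import norm

def har_j_var(rv, jumps, alpha=0.01):
    """One-day-ahead VaR (as a return quantile) from a HAR model with jumps."""
    rv, jumps = np.asarray(rv, float), np.asarray(jumps, float)
    d = rv[21:-1]                                                    # daily lag
    w = np.array([rv[t - 5:t].mean() for t in range(22, len(rv))])   # weekly average
    m = np.array([rv[t - 22:t].mean() for t in range(22, len(rv))])  # monthly average
    X = np.column_stack([np.ones_like(d), d, w, m, jumps[21:-1]])
    beta, *_ = np.linalg.lstsq(X, rv[22:], rcond=None)               # OLS fit
    x_next = np.array([1.0, rv[-1], rv[-5:].mean(), rv[-22:].mean(), jumps[-1]])
    rv_hat = max(x_next @ beta, 1e-12)                               # keep the forecast positive
    return norm.ppf(alpha) * np.sqrt(rv_hat)                         # e.g. 1% one-day VaR
```

The variance forecast is plugged into a Gaussian quantile to produce the VaR; the comparison in the paper concerns how much an explicit jump term improves forecasts of this kind.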

Relevance:

10.00%

Publisher:

Abstract:

The existence of the Macroscopic Fundamental Diagram (MFD), which relates space-mean density and flow, has been shown in urban networks under homogeneous traffic conditions. Since the MFD represents area-wide network traffic performance, studies on perimeter control strategies and area traffic state estimation utilizing the MFD concept have been reported. One of the key requirements for a well-defined MFD is homogeneity of the area-wide traffic condition over links of similar properties, which cannot be universally expected in the real world. For practical application of the MFD concept, several researchers have identified the factors influencing network homogeneity. However, they did not explicitly take into account the impact of drivers' behaviour and information provision, which has a significant effect on simulation outputs. This research aims to demonstrate the effect of dynamic information provision on network performance, employing the MFD as the measurement. A microscopic simulator, AIMSUN, is chosen as the experiment platform. By changing the ratio of en-route informed drivers to pre-trip informed drivers, different scenarios are simulated in order to investigate how drivers' adaptation to traffic congestion influences network performance with respect to the MFD shape as well as other indicators, such as total travel time. The study confirms the impact of information provision on the MFD shape and demonstrates the usefulness of the MFD for measuring the benefit of dynamic information provision.
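
A minimal sketch, with assumed variable names, of how simulator output is typically aggregated into one MFD point per interval: length-weighted space means of link density and link flow, in the spirit of Edie's generalised definitions:

```python
# One (space-mean density, space-mean flow) point of the MFD for a single
# aggregation interval, weighting each link by its length.
import numpy as np

def mfd_point(link_length_m, link_density_veh_km, link_flow_veh_h):
    L = np.asarray(link_length_m, float)
    k = np.asarray(link_density_veh_km, float)
    q = np.asarray(link_flow_veh_h, float)
    w = L / L.sum()                         # length weights over the network
    return (w * k).sum(), (w * q).sum()     # network density, network flow
```

Repeating this for every aggregation interval traces the MFD; scenarios with different shares of en-route and pre-trip informed drivers would each produce their own scatter.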

Relevance:

10.00%

Publisher:

Abstract:

Inspired by the remarkable properties of some biological composites in nature, we performed molecular dynamics simulations to investigate the mechanical behavior of bicontinuous nanocomposites. Three representative types of bicontinuous composites, with regular network, random network, and nacre-inspired microstructures respectively, were studied, and the results were compared with those of a honeycomb nanocomposite with only one continuous phase. It was found that the mechanical strength of the nanocomposites in a given direction strongly depends on the connectivity of the microstructure in that direction. Directional isotropy in mechanical strength and easy manufacturability favor the random network nanocomposite as a potentially excellent bioinspired composite with balanced performance. In addition, the tensile strength of the random network nanocomposite is less sensitive to interfacial failure, owing to its very high interface-to-volume ratio and the random distribution of internal interfaces. The results provide a useful guideline for the design and optimization of advanced nanocomposites with superior mechanical properties.

Relevance:

10.00%

Publisher:

Abstract:

The Macroscopic Fundamental Diagram (MFD) relates space-mean density and flow, and its existence, together with its dynamic features, has been confirmed in congested urban networks using real data sets from loop detectors and taxi probes. Since the MFD represents area-wide network traffic performance, it provides a foundation for perimeter control strategies and area traffic state estimation, enabling area-based network control. However, few real-world examples from signalised arterial networks have been reported. This paper fuses data from multiple sources (Bluetooth, loop detectors and traffic signals) and develops a framework for constructing the MFD for Brisbane. The existence of the MFD in the Brisbane network is confirmed. Different MFDs (from the whole network and from several sub-regions) are evaluated to explore spatial partitioning in representing network performance.
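
A minimal sketch of one common fusion step such a framework can rest on (an illustrative assumption, not necessarily the paper's algorithm): loop detectors provide link flow, Bluetooth matching provides travel time and hence space-mean speed, and density follows from the fundamental relation k = q / v:

```python
# Per-link, per-interval density estimate fusing loop flow with Bluetooth
# travel time via k = q / v.
import numpy as np

def link_density_veh_km(flow_veh_h, length_km, bt_travel_time_s):
    v_km_h = np.asarray(length_km, float) / (np.asarray(bt_travel_time_s, float) / 3600.0)
    return np.asarray(flow_veh_h, float) / np.maximum(v_km_h, 1e-6)  # guard near-zero speeds
```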

Relevance:

10.00%

Publisher:

Abstract:

Materials used in engineering always contain imperfections or defects, which significantly affect their performance. Based on large-scale molecular dynamics simulation and Euler–Bernoulli beam theory, the influence of different pre-existing surface defects on the bending properties of Ag nanowires (NWs) is studied in this paper. It is found that the nonlinear-elastic deformation, as well as the flexural rigidity of the NW, is insensitive to the surface defects considered here. On the contrary, an evident decrease in yield strength is observed due to the existence of defects. In-depth inspection of the deformation process reveals that, at the onset of plastic deformation, dislocation embryos initiate from the locations of surface defects, and the plastic deformation is dominated by the nucleation and propagation of partial dislocations at the considered temperature. In particular, the generation of stair-rod partial dislocations and Lomer–Cottrell locks is observed for both perfect and defected NWs. These structures inhibit early yielding of the NW, which explains why more defects do not necessarily imply a lower critical force.
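
A minimal sketch, assuming a doubly clamped wire loaded at midspan (the paper's actual loading configuration may differ), of how Euler–Bernoulli beam theory turns a simulated force-deflection curve into a flexural rigidity via F = 192·EI·δ/L³:

```python
# Fit the initial linear part of the force-deflection curve and convert the
# slope to EI with the doubly clamped midspan-load formula F = 192*EI*d/L^3.
import numpy as np

def flexural_rigidity(force_nN, deflection_nm, span_nm, linear_fraction=0.3):
    F = np.asarray(force_nN, float)
    d = np.asarray(deflection_nm, float)
    n = max(2, int(linear_fraction * len(F)))   # restrict to the elastic regime
    slope = np.polyfit(d[:n], F[:n], 1)[0]      # dF/dd in nN/nm
    return slope * span_nm**3 / 192.0           # EI in nN*nm^2
```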

Relevance:

10.00%

Publisher:

Abstract:

Scaffolding is an essential issue in tissue engineering, and scaffolds should satisfy certain essential criteria: biocompatibility, high porosity, and high pore interconnectivity to facilitate cell migration and fluid diffusion. In this work, a modified solvent casting–particulate leaching method is presented to produce scaffolds with spherical and interconnected pores. Sugar particles (200–300 µm and 300–500 µm) were poured through a horizontal Meker burner flame and collected below the flame. While crossing the high-temperature zone, the particles melted and adopted a spherical shape. The spherical particles were compressed in a plastic mold, and a poly-L-lactic acid solution was cast into the sugar assembly. After solvent evaporation, the sugar was removed by immersing the structure in distilled water for 3 days. The obtained scaffolds presented highly spherical interconnected pores, with interconnection pathways from 10 to 100 µm. Pore interconnection was obtained without any additional step. Compression tests were carried out to evaluate the scaffolds' mechanical performance. Moreover, rabbit bone marrow mesenchymal stem cells were found to adhere and proliferate in vitro in the scaffold over 21 days. This technique produced scaffolds with highly spherical and interconnected pores without the use of additional organic solvents to leach out the porogen.

Relevance:

10.00%

Publisher:

Abstract:

Reliability of the performance of biometric identity verification systems remains a significant challenge. Individual biometric samples of the same person (identity class) are not identical at each presentation, and performance degradation arises from intra-class variability and inter-class similarity. These limitations lead to false accepts and false rejects that are interdependent, so it is difficult to reduce the rate of one type of error without increasing the other. The focus of this dissertation is to investigate a method based on classifier fusion techniques to better control the trade-off between the verification errors, using text-dependent speaker verification as the test platform. A sequential classifier fusion architecture that integrates multi-instance and multi-sample fusion schemes is proposed. This fusion method enables a controlled trade-off between false alarms and false rejects. For statistically independent classifier decisions, analytical expressions for each type of verification error are derived from the base classifier performances. As this assumption may not always be valid, these expressions are modified to incorporate the correlation between statistically dependent decisions from clients and impostors. The architecture is empirically evaluated by applying it to text-dependent speaker verification, using hidden Markov model based, digit-dependent speaker models in each stage, with multiple attempts for each digit utterance. The trade-off between the verification errors is controlled using two parameters, the number of decision stages (instances) and the number of attempts at each decision stage (samples), fine-tuned on an evaluation/tuning set. The statistical validity of the derived error estimates is evaluated on test data. The performance of the sequential method is further shown to depend on the order in which the digits (instances) are combined and on the nature of the repeated attempts (samples). The false rejection and false acceptance rates of the proposed fusion are estimated using the base classifier performances, the variance in correlation between classifier decisions, and a sequence of classifiers with favourable dependence selected using the 'Sequential Error Ratio' criterion. The error rates are better estimated by incorporating user-dependent information (such as speaker-dependent thresholds and speaker-specific digit combinations) and class-dependent information (such as client-impostor dependent favourable combinations and class-error-based threshold estimation). The proposed architecture is desirable in most speaker verification applications, such as remote authentication and telephone and internet shopping. Tuning the parameters - the number of instances and samples - serves both the security and user-convenience requirements of speaker-specific verification. The architecture investigated here is applicable to verification using other biometric modalities such as handwriting, fingerprints and keystrokes.
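
A minimal sketch of the independence-case error expressions for such a sequential scheme, assuming for brevity equal base rates across stages: within a stage the claimant is accepted if any of m attempts passes, and must then pass all k stages:

```python
# Closed-form error rates for k sequential stages with m attempts per stage,
# assuming statistically independent classifier decisions.
def sequential_fusion_errors(far, frr, k_instances, m_samples):
    far_stage = 1.0 - (1.0 - far) ** m_samples  # OR over repeated attempts
    frr_stage = frr ** m_samples                # rejected only if every attempt fails
    far_total = far_stage ** k_instances        # an impostor must fool every stage
    frr_total = 1.0 - (1.0 - frr_stage) ** k_instances
    return far_total, frr_total

# More stages cut false accepts; more attempts per stage cut false rejects.
print(sequential_fusion_errors(far=0.05, frr=0.05, k_instances=3, m_samples=2))
```

Raising k suppresses false accepts while raising m suppresses false rejects, which is the controlled trade-off described above; correlated decisions require the modified expressions the dissertation derives.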

Relevance:

10.00%

Publisher:

Abstract:

Increasing global competition, rapid technological change, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high-quality products at lower cost and in a shorter time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to evaluate lean implementation efforts clearly. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring performance improvements in dynamic lean supply chain situations. Appropriate measurement of lean supply chain performance has therefore become imperative. Many lean tools are available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involving high-volume products but not for low-volume products, and there is currently no systematic methodology for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure supply chain performance using both quantitative and qualitative metrics, and investigates the effects of product type and lean tool selection on supply chain performance. Supply chain performance metrics, and the effects of various lean tools on the performance metrics defined in the SCOR framework, have been investigated. A lean supply chain model based on the SCOR metric framework is then developed in which non-lean and lean, as well as quantitative and qualitative, metrics are incorporated. The values of the metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data were collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy-based method was applied to measure the performance improvements in the supply chains. Using the fuzzy TOPSIS method, which chooses an optimal alternative by maximising similarity to the positive ideal solution and minimising similarity to the negative ideal solution, the performance of lean and non-lean supply chain situations for three different apparel products was evaluated. To address the research questions concerning an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that the implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS based method developed here integrates multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics; together they can effectively measure the improvement of a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have significant effects on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
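
A minimal sketch of fuzzy TOPSIS over triangular fuzzy numbers, assuming benefit criteria and equal weights to keep the example short (the thesis's full metric set and weighting are not reproduced):

```python
# Fuzzy TOPSIS: normalise a matrix of triangular fuzzy ratings, then score each
# alternative by its closeness to a fuzzy positive ideal solution relative to
# its distance from the fuzzy negative ideal solution.
import numpy as np

def fuzzy_topsis(tfn_matrix):
    """tfn_matrix: (alternatives, criteria, 3) array of (l, m, u) triples."""
    X = np.asarray(tfn_matrix, float)
    X = X / X[..., 2].max(axis=0)[None, :, None]        # scale by max upper value per criterion
    d = lambda a, b: np.sqrt(((a - b) ** 2).mean(-1))   # vertex distance between TFNs
    pis, nis = np.ones(3), np.zeros(3)                  # fuzzy ideal / anti-ideal rating
    d_plus = d(X, pis).sum(axis=1)                      # distance to positive ideal
    d_minus = d(X, nis).sum(axis=1)                     # distance to negative ideal
    return d_minus / (d_plus + d_minus)                 # closeness coefficient, higher is better

# e.g. two supply chain configurations rated on two criteria:
cc = fuzzy_topsis([[(3, 5, 7), (5, 7, 9)], [(1, 3, 5), (7, 9, 10)]])
print(cc.argsort()[::-1])   # ranking, best first
```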

Relevance:

10.00%

Publisher:

Abstract:

Managing public sector projects in Malaysia is a unique challenge because of the ethical issues involved in the project procurement process. These ethical issues need attention because they affect the quality, cost and time of the project itself; they include conflict of interest, bid shopping, collusive tendering, bid cutting, corruption and the payment game. In 2006, 17.3% of 417 Malaysian government contract projects were considered sick because contractors failed to execute the project according to the project plan, and some of these sick projects stem from the ethical issues involved. Such construction projects suffer low quality when contractors are selected unethically, on the basis of personal relationships instead of professional qualifications. It is therefore important to govern the project procurement process so that accountable and transparent decision making ensures these ethical issues can be avoided. Extensive research has been conducted on ethical issues in the tendering process and the award phase of project management, but there is a lack of studies on the role of clients, including the government client, in relation to unethical practice in public sector project procurement. It is important to understand that ethical issues involve not only the contractors and suppliers but also the clients. Even though codes of ethics exist in the public sector, ethical issues still arise. Therefore, this research develops a project governance framework for ethical decision making (PGEDM) in the Malaysian public sector. The framework combines the ethical decision-making process with project governance principles to guide the public sector in ethical decision making in project procurement. A triangulation of a questionnaire survey and a Delphi study was employed to collect the required qualitative and quantitative data. The questionnaire survey was conducted among public officials (the practitioners) currently working in procurement in the Malaysian public sector, to identify ethical behaviours and the factors influencing further ethical behaviour. The Delphi study was conducted with a panel of experts consisting of practitioners with expertise in project governance and project procurement as well as academics, and further considered the relationships and influence of the criteria and indicators of ethical decision making (EDM) and project governance (project criteria, organisational culture, contract award criteria, individual criteria, client's requirements, government procedures and professional ethics). Through the identification and integration of the factors and EDM criteria, as well as the project governance criteria and EDM steps for ethical issues, the PGEDM framework was developed to promote and drive consistent decision outcomes in public sector project procurement. The framework contributes significantly to ethical decision making in the project procurement process. The findings benefit not only the people involved in project procurement but also public officials, guiding them to be more accountable in handling ethical issues and to make decisions more transparently.

Relevance:

10.00%

Publisher:

Abstract:

The intentions of the science curriculum are very often constrained by the forms of student learning that are required by, or are currently available within, the system of education. Furthermore, little attention is given to developing new approaches to assessment that would encourage these good intentions. In this chapter, we argue that broadening the intentions of science education in this way will require a diversity of assessment techniques, and that only a profile of each student's achievement will capture the range of intended learnings. We explore a variety of assessment modes that match some of these new aspects of science learning and that also provide students with both formative information and a more comprehensive and authentic summative profile of their performances. Our discussion is illustrated with research-based examples of assessment practice in relation to three aspects of science education that are increasingly referred to in curriculum statements as desirable human dimensions of science: context-based science education; decision-making processes and socioscientific issues; and integrated science education. We conclude with some notes on what these broader kinds of assessment mean for teachers, and the support they would need to include them in their day-to-day practices in science classrooms if, and when, the mainstream of science teaching and learning takes these curricular intentions seriously.

Relevance:

10.00%

Publisher:

Abstract:

Composed by David Bridie and Andree Greenwell, with script and lyrics by Margery Forde and Michael Forde, BEHIND THE CANE was community-driven music theatre, commissioned as the signature work of the 2011 Queensland Music Festival. Co-presented by the QMF and the Whitsunday Regional Council in association with QUT Creative Industries, BEHIND THE CANE was created with and performed by over 180 Bowen residents. It told the story of the South Sea Islanders who were brought to Australia to work in the cane fields in the 19th century, and of the journey of their descendants through the succeeding generations, through racial discrimination and economic hardship, to the present day. The large-scale spectacle was performed at the sound shell on the Bowen harbour foreshore to audiences of 8,000 over 3 performances, with many of the descendants in featured roles.

Relevance:

10.00%

Publisher:

Abstract:

With the projected increase in older adults, the older driver population is estimated to be the fastest-growing cohort of drivers in many developed countries. The increased physical fragility associated with the aging process makes older adults who drive private automobiles a vulnerable road user group. Much of the current research on older drivers' behaviours and practices relies on self-report data. This paper explores the utility of in-vehicle devices (Global Positioning Systems and recording accelerometers) in assessing older drivers' habitual driving behaviours. Seventy-eight older drivers (above 65 years of age) from the Australian Capital Territory, Australia, participated in the study, and their driving behaviours and practices were prospectively assessed over a two-week period. The use of combined GPS and recording accelerometers to improve understanding of older drivers' behaviour shows promise in the current study. The challenges of using multiple in-vehicle devices to assess driving behaviours and performance in this cohort are discussed, and, based on the current findings, recommendations for future research on the use of in-vehicle devices among older drivers are proposed.
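
A minimal sketch, with an assumed column layout and a hypothetical gap threshold, of the kind of habitual-driving summary such devices enable: segmenting a fortnight of GPS fixes into trips at long time gaps and accumulating haversine distance per trip:

```python
# Split timestamped GPS fixes into trips wherever the gap between fixes
# exceeds `gap_s`, and total the haversine distance of each trip.
import numpy as np

def trip_distances_km(ts_s, lat_deg, lon_deg, gap_s=300):
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    dlat, dlon = np.diff(lat), np.diff(lon)
    a = np.sin(dlat / 2) ** 2 + np.cos(lat[:-1]) * np.cos(lat[1:]) * np.sin(dlon / 2) ** 2
    step_km = 2 * 6371.0 * np.arcsin(np.sqrt(a))    # haversine leg lengths
    trip_id = np.cumsum(np.diff(ts_s) > gap_s)      # new trip after a long gap
    return np.bincount(trip_id, weights=step_km)    # total distance per trip
```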

Relevance:

10.00%

Publisher:

Abstract:

The huge amount of CCTV footage available makes it very burdensome to process these videos manually through human operators, so automated processing of video footage through computer vision technologies has become necessary. Over the past several years there has been a large effort to detect abnormal activities with computer vision techniques. Typically, the problem is formulated as a novelty detection task in which the system is trained on normal data and is required to detect events that do not fit the learned 'normal' model. There is no precise definition of an abnormal activity; it depends on the context of the scene, so different feature sets are required to detect different kinds of abnormal activities. In this work we evaluate the performance of different state-of-the-art features for detecting abnormal objects in the scene: optical flow vectors to detect motion-related anomalies, and textures of optical flow and image textures to detect the presence of abnormal objects. These extracted features, in different combinations, are modeled using state-of-the-art models such as the Gaussian mixture model (GMM) and the semi-2D hidden Markov model (HMM), and their performance is analysed. Further, we apply perspective normalization to the extracted features to compensate for perspective distortion due to the distance between the camera and the objects under consideration. The proposed approach is evaluated on the publicly available UCSD datasets, and we demonstrate improved performance compared with other state-of-the-art methods.
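
A minimal sketch of the novelty-detection recipe described, with a hypothetical patch size, threshold and `normal_frame_pairs` input: per-patch optical-flow features extracted from normal footage train a Gaussian mixture, and test patches with low likelihood under it are flagged as abnormal:

```python
# Dense optical-flow features per image patch, modelled by a GMM trained on
# normal footage; abnormality is a low log-likelihood under that model.
import cv2
import numpy as np
from sklearn.mixture import GaussianMixture

def flow_features(prev_gray, gray, patch=16):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = gray.shape
    feats = []
    for y in range(0, h - patch + 1, patch):        # per-patch mean flow vector
        for x in range(0, w - patch + 1, patch):
            block = flow[y:y + patch, x:x + patch]
            feats.append(block.reshape(-1, 2).mean(axis=0))
    return np.array(feats)

gmm = GaussianMixture(n_components=8, covariance_type='full')
# Train on normal frames, then flag low-likelihood patches at test time:
# gmm.fit(np.vstack([flow_features(f0, f1) for f0, f1 in normal_frame_pairs]))
# abnormal = gmm.score_samples(flow_features(t0, t1)) < threshold
```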

Relevance:

10.00%

Publisher:

Abstract:

Diagnostics of rotating machinery has developed significantly in recent decades, and industrial applications are spreading in different sectors. Most applications are characterized by varying shaft velocities, and in many cases transients are the most critical conditions to monitor. Under such variable-speed conditions, fault symptoms are clearer in the angular/order domain than in the common time/frequency domain. In the past, this issue was often solved by sampling data synchronously with the shaft by means of phase-locked circuits governing the acquisition; thanks to the spread of cheap and powerful microprocessors, however, this procedure is now rarer: sampling is usually performed at constant time intervals, and the conversion to the order domain is made by means of digital signal processing techniques. In recent decades, different algorithms have been proposed for extracting an order spectrum from a signal sampled asynchronously with respect to the shaft rotational velocity. Many of them (the so-called computed order tracking family) use interpolation to resample the signal at constant angular increments, followed by a standard discrete Fourier transform to move from the angular domain to the order domain. A less exploited family of techniques shifts directly from the time domain to the order spectrum by means of modified Fourier transforms. This paper proposes a new transform, named the velocity synchronous discrete Fourier transform, which takes advantage of the instantaneous velocity to improve the quality of its result, reaching performance that can challenge computed order tracking.
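
For contrast with the proposed transform, here is a minimal sketch of the computed-order-tracking baseline it challenges (positive, monotonically increasing shaft speed assumed): integrate the instantaneous speed into a phase profile, resample the signal at constant angular increments by interpolation, and take an FFT so spectral lines fall at shaft orders:

```python
# Computed order tracking: the instantaneous speed profile (rev/s, sampled at
# fs like the signal) is integrated to a phase axis, the signal is resampled
# at uniform angle steps, and the FFT axis then reads in shaft orders.
import numpy as np

def order_spectrum(x, fs, speed_rev_s, samples_per_rev=64):
    phase_rev = np.cumsum(np.asarray(speed_rev_s, float)) / fs   # shaft revolutions vs time
    theta = np.arange(0.0, phase_rev[-1], 1.0 / samples_per_rev)
    x_ang = np.interp(theta, phase_rev, x)          # resample at constant angular increments
    X = np.abs(np.fft.rfft(x_ang)) / len(x_ang)
    orders = np.fft.rfftfreq(len(x_ang), d=1.0 / samples_per_rev)  # cycles per revolution
    return orders, X
```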

Relevance:

10.00%

Publisher:

Abstract:

Diagnostics of rolling element bearings is usually performed by means of vibration signals measured by accelerometers placed in the proximity of the bearing under investigation. The aim is to monitor the integrity of the bearing components in order to avoid catastrophic failures or to implement condition-based maintenance strategies. The trend in this field is to combine different signal-enhancement and signal-analysis techniques in a single algorithm. Among the former, Minimum Entropy Deconvolution (MED) has been identified as a key tool for highlighting, within the vibration signal, the effect of possible damage to one of the bearing components. This paper presents the application of this technique to signals collected on a simple test rig, able to test damaged industrial roller bearings in different working conditions. The effectiveness of the technique has been assessed by comparing the results for one undamaged bearing with those for three bearings artificially damaged in different locations, namely on the inner race, the outer race and the rollers. Since MED performance depends on the filter length, the most suitable value of this parameter is defined on the basis of both the application and the measured signals; this constitutes an original contribution of the paper.
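
A minimal sketch of a Wiggins-style MED iteration (a simplified textbook form, not the paper's implementation): a FIR filter is updated so that the kurtosis-like norm of its output grows, sharpening the repetitive impulses a localised bearing fault injects into the signal:

```python
# Wiggins-style MED: solve R f = b repeatedly, where R is the input
# autocorrelation matrix and b the cross-correlation of the cubed output with
# the input; each step increases the kurtosis-like norm of the filtered signal.
import numpy as np
from scipy.linalg import toeplitz
from scipy.signal import lfilter

def med_filter(x, filter_len=30, n_iter=30):
    x = np.asarray(x, float)
    r = np.correlate(x, x, 'full')[len(x) - 1:len(x) - 1 + filter_len]
    R = toeplitz(r) + 1e-9 * r[0] * np.eye(filter_len)   # tiny ridge for numerical safety
    f = np.zeros(filter_len)
    f[filter_len // 2] = 1.0                             # delayed-impulse initialisation
    for _ in range(n_iter):
        y = lfilter(f, 1.0, x)
        # cross-correlation of y^3 with x (circular shift; edge effects ignored)
        b = np.array([np.dot(y**3, np.roll(x, k)) for k in range(filter_len)])
        f = np.linalg.solve(R, b)
        f /= np.linalg.norm(f)                           # fix the filter scale
    return lfilter(f, 1.0, x), f
```

The filter length passed as `filter_len` is exactly the parameter whose tuning the paper identifies as critical to MED performance.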