915 results for Linear semi-infinite optimization


Relevance:

20.00%

Publisher:

Abstract:

The overall objective of this thesis is to explore how and why the content of individuals' psychological contracts changes over time. The contract is generally understood as "individual beliefs, shaped by the organisation, regarding the terms of an exchange agreement between individuals and their organisation" (Rousseau, 1995, p. 9). With an overall study sampling frame of 320 graduate organisational newcomers, a mixed method longitudinal research design comprised of three sequential, inter-related studies is employed in order to capture the change process. From the 15 semi-structured interviews conducted in Study 1, the key findings included identifying a relatively high degree of mutuality between employees' and their managers' reciprocal contract beliefs around the time of organisational entry. Also, at this time, individuals had developed specific components of their contract content through a mix of social network information (regarding broader employment expectations) and perceptions of various elements of their particular organisation's reputation (for more firm-specific expectations). Study 2 utilised a four-wave survey approach (available to the full sampling frame) over the 14 months following organisational entry to explore the 'shape' of individuals' contract change trajectories and the role of four theorised change predictors in driving these trajectories. The predictors represented an organisational-level informational cue (perceptions of corporate reputation), a dyadic-level informational cue (perceptions of manager-employee relationship quality) and two individual difference variables (affect and hardiness). Through the use of individual growth modelling, the findings showed differences in the general change patterns across contract content components of perceived employer (exhibiting generally quadratic change patterns) and employee (exhibiting generally no-change patterns) obligations. Further, individuals differentially used the predictor variables to construct beliefs about specific contract content. While both organisational- and dyadic-level cues were focused upon to construct employer obligation beliefs, organisational-level cues and individual difference variables were focused upon to construct employee obligation beliefs. Through undertaking 26 semi-structured interviews, Study 3 focused upon gaining a richer understanding of why participants' contracts changed, or otherwise, over the study period, with a particular focus upon the roles of breach and violation. Breach refers to an employee's perception that an employer obligation has not been met and violation refers to the negative and affective employee reactions which may ensue following a breach. The main contribution of these findings was identifying that subsequent to a breach or violation event a range of 'remediation effects' could be activated by employees which, depending upon their effectiveness, served to instigate either breach or contract repair or both. These effects mostly instigated broader contract repair and were generally cognitive strategies enacted by an individual to re-evaluate the breach situation and re-focus upon other positive aspects of the employment relationship. As such, the findings offered new evidence for a clear distinction between remedial effects which serve to only repair the breach (and thus the contract) and effects which only repair the contract more broadly; however, when effective, both resulted in individuals again viewing their employment relationships positively. 
Overall, in response to the thesis's overarching research question of how and why individuals' psychological contract beliefs change, the findings show that individuals do indeed draw upon various information sources, particularly at the organisational level, as cues or guides in shaping their contract content. Further, the 'shapes' of the changes in beliefs about employer and employee obligations generally follow different, and not necessarily linear, trajectories over time. Finally, breach and violation, together with the remedial actions which address these occurrences either by remedying the breach itself (and thus the contract) or the contract only, play central roles in guiding individuals' contract changes to greater or lesser degrees. The findings provide both academics and practitioners with greater insight into how employees construct their contract beliefs over time, the salient informational cues used to do this, and how the effects of breach and violation can be mitigated by creating an environment which facilitates the use of effective remediation strategies.
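Purely as an illustration of the individual growth modelling mentioned above, the sketch below fits a quadratic growth curve with a person-level random intercept to synthetic four-wave data using statsmodels; the variable names (employer_oblig, person, time) and all effect sizes are made-up assumptions, not the study's data or model specification.

```python
# Hedged sketch of an individual growth model on synthetic four-wave data
# (not the study's data): quadratic fixed effect of time, random intercept per person.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
waves = np.array([0.0, 4.0, 9.0, 14.0])          # months since organisational entry
rows = []
for pid in range(120):
    baseline = rng.normal(4.0, 0.4)              # person-level starting belief (illustrative)
    for t in waves:
        y = baseline - 0.10 * t + 0.005 * t ** 2 + rng.normal(0, 0.2)  # quadratic trajectory
        rows.append({"person": pid, "time": t, "employer_oblig": y})
df = pd.DataFrame(rows)

# Mixed-effects growth model: fixed linear + quadratic time, random intercept by person.
model = smf.mixedlm("employer_oblig ~ time + I(time ** 2)", data=df, groups="person")
print(model.fit().summary())
```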

Relevance:

20.00%

Publisher:

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers’ expectations by supplying reliable, good-quality products and services is the key factor for an organization and even for government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now being applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two aspects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; then a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle. Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause efforts in the quality improvement cycle since it cuts the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach enables highly informative estimates of change point parameters since probability-distribution-based results are obtained. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of the shifts, compared to a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then continued in healthcare surveillance for processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention which is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, together with the empirical results, indicate that the developed Bayesian estimators are a strong alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed model are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also be extended to the industrial and business domains where quality monitoring was initially developed.
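To make the change point estimation concrete, the sketch below computes the exact posterior for the time of a single step change in a Poisson rate, using conjugate Gamma priors on the before/after rates and a uniform prior on the change point; it is a simplified stand-in for the thesis's hierarchical MCMC estimators, and the counts are simulated.

```python
# Minimal illustrative sketch (not the thesis's models): posterior for the time of a
# single step change in a Poisson rate, with Gamma(a, b) priors and simulated counts.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(1)
y = np.concatenate([rng.poisson(4.0, 30), rng.poisson(7.0, 20)])  # rate shifts after t = 30
n, a, b = len(y), 1.0, 0.1                                        # Gamma(a, b) prior on each rate

def log_marginal(counts, a, b):
    """log p(counts | one segment, Gamma(a, b) prior), dropping terms constant in tau."""
    s, m = counts.sum(), len(counts)
    return a * np.log(b) - gammaln(a) + gammaln(a + s) - (a + s) * np.log(b + m)

# Posterior over the change point tau (first tau observations governed by the first rate).
taus = np.arange(1, n)
log_post = np.array([log_marginal(y[:t], a, b) + log_marginal(y[t:], a, b) for t in taus])
post = np.exp(log_post - log_post.max())
post /= post.sum()

tau_hat = taus[post.argmax()]
print(f"posterior mode of change point: t = {tau_hat}")
print(f"posterior mean rates: before = {(a + y[:tau_hat].sum()) / (b + tau_hat):.2f}, "
      f"after = {(a + y[tau_hat:].sum()) / (b + n - tau_hat):.2f}")
```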

Relevance:

20.00%

Publisher:

Abstract:

The growing demand for air conditioning is one of the largest contributors to Australia’s overall electricity consumption. This has started to create peak-load supply problems for some electricity utilities, particularly in Queensland. This research aimed to develop a consumer demand-side response model to assist electricity consumers to mitigate peak demand on the electrical network. The developed model allows consumers to manage and control air conditioning in every pricing period; this is referred to as intelligent control. The research investigates the optimal response of end-users to electricity prices for several cases expected in the near future: no-spike, spike and probabilistic-spike price cases. The results indicate the potential of the scheme to achieve energy savings, reduce consumers' electricity bills (costs) and target the best economic performance for electricity generation, distribution and transmission.
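As a rough illustration of the kind of price-responsive decision described (not the thesis's model), the sketch below picks an air-conditioning setting for a period by minimising expected cost under no-spike, probabilistic-spike and certain-spike prices; the prices, consumption levels and comfort penalties are hypothetical.

```python
# Illustrative sketch only (numbers and comfort-penalty model are hypothetical):
# for each period, choose the AC setting that minimises
# expected cost = expected energy price * consumption + a comfort penalty.
def expected_price(normal_price, spike_price, spike_prob):
    return (1 - spike_prob) * normal_price + spike_prob * spike_price

def best_setting(normal_price, spike_price, spike_prob,
                 settings=((3.0, 0.00),    # full cooling: 3 kWh, no discomfort
                           (2.0, 0.05),    # reduced cooling: 2 kWh, small penalty ($)
                           (1.0, 0.20))):  # minimal cooling: 1 kWh, larger penalty ($)
    price = expected_price(normal_price, spike_price, spike_prob)
    costs = [kwh * price + discomfort for kwh, discomfort in settings]
    i = min(range(len(settings)), key=costs.__getitem__)
    return settings[i], costs[i]

for prob in (0.0, 0.3, 1.0):                        # no spike, probable spike, certain spike
    setting, cost = best_setting(normal_price=0.05, spike_price=2.50, spike_prob=prob)
    print(f"spike probability {prob:.1f}: run at {setting[0]} kWh, expected cost ${cost:.2f}")
```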

Relevance:

20.00%

Publisher:

Abstract:

The railhead is severely stressed under the localized wheel contact patch close to the gaps in insulated rail joints. A modified railhead profile in the vicinity of the gapped joint, obtained through a shape optimization model based on a coupled genetic algorithm and finite element method, effectively alters the contact zone and significantly reduces the railhead edge stress concentration. Two optimization methods, a grid search method and a genetic algorithm, were employed for this problem. The optimal results from the two methods are discussed and, in particular, their suitability for the rail end stress minimization problem is studied. Through several numerical examples, the optimal profile is shown to be unaffected by either the magnitude or the contact position of the loaded wheel. The numerical results are validated through a large-scale experimental study.
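The sketch below illustrates only the genetic-algorithm side of such a shape optimisation loop; a cheap quadratic surrogate stands in for the finite element stress evaluation, and the control-point parameterisation, bounds and GA settings are illustrative assumptions rather than the paper's actual model.

```python
# Minimal sketch of the genetic-algorithm side of a shape optimisation loop.
# The real method evaluates railhead edge stress with a finite element model;
# here a cheap quadratic surrogate stands in for that evaluation.
import numpy as np

rng = np.random.default_rng(0)
N_CTRL = 6                       # control points describing the railhead-end profile
LOW, HIGH = -1.0, 1.0            # allowed vertical adjustment of each control point (mm)

def stress(profile):
    """Surrogate for the FE-computed peak edge stress (lower is better)."""
    target = np.linspace(-0.8, 0.2, N_CTRL)       # pretend optimum, for demonstration only
    return float(np.sum((profile - target) ** 2) + 1.0)

def ga(pop_size=40, generations=60, mut_sigma=0.1, elite=2):
    pop = rng.uniform(LOW, HIGH, size=(pop_size, N_CTRL))
    for _ in range(generations):
        fitness = np.array([stress(p) for p in pop])
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]             # truncation selection
        children = []
        while len(children) < pop_size - elite:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_CTRL)                 # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child += rng.normal(0.0, mut_sigma, N_CTRL)   # Gaussian mutation
            children.append(np.clip(child, LOW, HIGH))
        pop = np.vstack([pop[order[:elite]], np.array(children)])
    fitness = np.array([stress(p) for p in pop])
    return pop[fitness.argmin()], fitness.min()

best, best_stress = ga()
print("best profile offsets:", np.round(best, 3), "surrogate stress:", round(best_stress, 4))
```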

Relevance:

20.00%

Publisher:

Abstract:

Quantifying spatial and/or temporal trends in environmental modelling data requires that measurements be taken at multiple sites. The number of sites and the duration of measurement at each site must be balanced against the cost of equipment and the availability of trained staff. The split panel design comprises short measurement campaigns at multiple locations and continuous monitoring at reference sites [2]. Here we present a modelling approach for a spatio-temporal model of ultrafine particle number concentration (PNC) recorded according to a split panel design. The model describes the temporal trends and background levels at each site. The data were measured as part of the “Ultrafine Particles from Transport Emissions and Child Health” (UPTECH) project, which aims to link air quality measurements, child health outcomes and a questionnaire on each child’s history and demographics. The UPTECH project involves measuring aerosol and particle counts and local meteorology at each of 25 primary schools for two weeks and at three long-term monitoring stations, as well as health outcomes for a cohort of students at each school [3].
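A much-simplified sketch of a split-panel fit is given below: log PNC is modelled as a site-specific background plus a shared daily cycle, estimated jointly from a continuously monitored reference site and two short campaign sites; the data are synthetic and the model is far simpler than the one used in the UPTECH analysis.

```python
# Simplified split-panel sketch (synthetic data, not the UPTECH model):
# log PNC = site-specific background + a shared daily cycle, estimated jointly from
# a long-term reference site and two short campaign sites.
import numpy as np

rng = np.random.default_rng(3)
hours = np.arange(24 * 14)                                   # two-week record, hourly

def harmonics(t, period=24.0):
    w = 2 * np.pi * t / period
    return np.column_stack([np.sin(w), np.cos(w), np.sin(2 * w), np.cos(2 * w)])

true_cycle = 0.4 * np.sin(2 * np.pi * hours / 24)
sites = {"reference": (hours, 8.5),            # continuous monitoring
         "school_A": (hours[:168], 9.0),       # one-week campaign
         "school_B": (hours[168:], 8.2)}       # one-week campaign
site_names = list(sites)
ys, X_site, X_time = [], [], []
for k, (name, (t, background)) in enumerate(sites.items()):
    ys.append(background + true_cycle[t % len(true_cycle)] + rng.normal(0, 0.15, len(t)))
    X_site.append(np.eye(len(site_names))[np.full(len(t), k)])   # site indicator columns
    X_time.append(harmonics(t))                                  # shared daily cycle terms
y = np.concatenate(ys)
X = np.hstack([np.vstack(X_site), np.vstack(X_time)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(site_names, beta[: len(site_names)]):
    print(f"estimated log-PNC background at {name}: {b:.2f}")
```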

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present the outcomes of a project on the exploration of the use of Field Programmable Gate Arrays (FPGAs) as co-processors for scientific computation. We designed a custom circuit for the pipelined solving of multiple tri-diagonal linear systems. The design is well suited for applications that require many independent tri-diagonal system solves, such as finite difference methods for solving PDEs or applications utilising cubic spline interpolation. The selected solver algorithm was the Tri-Diagonal Matrix Algorithm (TDMA, or Thomas Algorithm). Our solver supports user-specified precision through the use of a custom floating-point VHDL library supporting addition, subtraction, multiplication and division. The variable-precision TDMA solver was tested for correctness in simulation mode. The TDMA pipeline was tested successfully in hardware using a simplified solver model. The details of implementation, the limitations, and future work are also discussed.
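For reference, the Thomas algorithm itself is short; the sketch below is a plain software implementation (handy, for example, for checking a hardware solver's output against a dense solve) and is not a model of the VHDL pipeline.

```python
# Plain software reference for the Thomas algorithm (TDMA) described above.
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system. a: sub-diagonal (len n, a[0] unused),
    b: diagonal (len n), c: super-diagonal (len n, c[-1] unused), d: right-hand side."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                        # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):               # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against a dense solve.
n = 6
a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0)
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))   # True
```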

Relevance:

20.00%

Publisher:

Abstract:

Virus-like particle-based vaccines for high-risk human papillomaviruses (HPVs) appear to have great promise; however, cell culture-derived vaccines will probably be very expensive. The optimization of expression of different codon-optimized versions of the HPV-16 L1 capsid protein gene in plants has been explored by means of transient expression from a novel suite of Agrobacterium tumefaciens binary expression vectors, which allow targeting of recombinant protein to the cytoplasm, endoplasmic reticulum (ER) or chloroplasts. A gene resynthesized to reflect human codon usage expresses better than the native gene, which expresses better than a plant-optimized gene. Moreover, chloroplast localization allows significantly higher levels of accumulation of L1 protein than does cytoplasmic localization, whilst ER retention was least successful. High levels of L1 (>17% total soluble protein) could be produced via transient expression: the protein assembled into higher-order structures visible by electron microscopy, and a concentrated extract was highly immunogenic in mice after subcutaneous injection and elicited high-titre neutralizing antibodies. Transgenic tobacco plants expressing a human codon-optimized gene linked to a chloroplast-targeting signal expressed L1 at levels up to 11% of the total soluble protein. These are the highest levels of HPV L1 expression reported for plants: these results, and the excellent immunogenicity of the product, significantly improve the prospects of making a conventional HPV vaccine by this means. © 2007 SGM.

Relevance:

20.00%

Publisher:

Abstract:

A baculovirus-insect cell expression system potentially provides the means to produce prophylactic HIV-1 virus-like particle (VLP) vaccines inexpensively and in large quantities. However, the system must be optimized to maximize yields and increase process efficiency. In this study, we optimized the production of two novel, chimeric HIV-1 VLP vaccine candidates (GagRT and GagTN) in insect cells. This was done by monitoring the effects of four specific factors on VLP expression: these were insect cell line, cell density, multiplicity of infection (MOI), and infection time. The use of western blots, Gag p24 ELISA, and four-factorial ANOVA allowed the determination of the most favorable conditions for chimeric VLP production, as well as which factors affected VLP expression most significantly. Both VLP vaccine candidates favored similar optimal conditions, demonstrating higher yields of VLPs when produced in the Trichoplusia ni Pro insect cell line, at a cell density of 1 × 10⁶ cells/mL, and an infection time of 96 h post infection. It was found that cell density and infection time were major influencing factors, but that MOI did not affect VLP expression significantly. This work provides a potentially valuable guideline for HIV-1 protein vaccine optimization, as well as for general optimization of a baculovirus-based expression system to produce complex recombinant proteins. © 2009 American Institute of Chemical Engineers.
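The sketch below illustrates the four-factorial ANOVA approach on synthetic p24 readings; the factor levels, replicate counts and effect sizes are invented for demonstration and are not the study's data.

```python
# Sketch of a four-factor ANOVA on synthetic (made-up) p24 readings.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(7)
levels = {
    "cell_line": ["Sf9", "TniPro"],
    "density": ["1e6", "2e6"],           # cells/mL at infection
    "moi": ["low", "high"],
    "harvest": ["72h", "96h"],
}
rows = []
for combo in itertools.product(*levels.values()):
    cell_line, density, moi, harvest = combo
    # moi deliberately has no effect here, echoing the abstract's finding.
    mean = 100 + 30 * (cell_line == "TniPro") + 25 * (harvest == "96h") + 15 * (density == "1e6")
    for _ in range(3):                   # three replicate infections per combination
        rows.append(dict(zip(levels, combo), p24=rng.normal(mean, 10)))
df = pd.DataFrame(rows)

model = smf.ols("p24 ~ C(cell_line) * C(density) * C(moi) * C(harvest)", data=df).fit()
print(anova_lm(model, typ=2)[["F", "PR(>F)"]])
```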

Relevance:

20.00%

Publisher:

Abstract:

Vehicular Ad-hoc Networks (VANETs) have different characteristics compared to other mobile ad-hoc networks. The vehicles, which act as routers and clients, are highly dynamic and connected by unreliable radio links, so routing becomes a complex problem. First, we propose CO-GPSR (Cooperative GPSR), an extension of the traditional GPSR (Greedy Perimeter Stateless Routing) which uses relay nodes that exploit radio path diversity in a vehicular network to increase routing performance. Next, we formulate a multi-objective decision-making problem to select optimum packet-relaying nodes to increase the routing performance further. We use cross-layer information for the optimization process. We evaluate the routing performance comprehensively using realistic vehicular traces and a Nakagami fading propagation model optimized for highway scenarios in VANETs. Our results show that when multi-objective decision making is used for cross-layer optimization of routing, a 70% performance increment can be obtained for low vehicle densities on average, which is a twofold increase compared to the single-criterion maximization approach.
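The paper's exact formulation is not given in the abstract, but the general idea of multi-objective relay selection can be sketched as below: several cross-layer metrics are normalised and combined into a relay score, in contrast to maximising a single criterion such as geographic progress; the metrics and weights are hypothetical.

```python
# Generic sketch (hypothetical metrics and weights, not the paper's exact formulation):
# pick a relay by normalising several cross-layer criteria and combining them.
from dataclasses import dataclass

@dataclass
class Candidate:
    node_id: int
    progress_m: float       # geographic progress toward the destination (m)
    link_snr_db: float      # physical-layer link quality
    queue_free: float       # fraction of MAC queue still free (0..1)

def normalise(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 1.0 for v in values]

def pick_relay(candidates, weights=(0.4, 0.4, 0.2)):
    cols = [normalise([c.progress_m for c in candidates]),
            normalise([c.link_snr_db for c in candidates]),
            normalise([c.queue_free for c in candidates])]
    scores = [sum(w * col[i] for w, col in zip(weights, cols)) for i in range(len(candidates))]
    return max(zip(scores, candidates), key=lambda sc: sc[0])[1]

neighbours = [Candidate(1, 180.0, 6.0, 0.9),     # far ahead but weak link
              Candidate(2, 120.0, 18.0, 0.7),    # good balance
              Candidate(3, 60.0, 22.0, 0.2)]     # strong link, little progress
print("greedy (progress only):", max(neighbours, key=lambda c: c.progress_m).node_id)
print("multi-objective choice:", pick_relay(neighbours).node_id)
```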

Relevance:

20.00%

Publisher:

Abstract:

An iterative strategy is proposed for finding the optimal rating and location of fixed and switched capacitors in distribution networks. The substation Load Tap Changer tap is also set during this procedure. A Modified Discrete Particle Swarm Optimization is employed in the proposed strategy. The objective function is composed of the distribution line loss cost and the capacitor investment cost. The line loss is calculated using an estimate of the load duration curve discretised into multiple levels. The constraints are the bus voltages and feeder currents, which should be maintained within their standard ranges. For validation of the proposed method, two case studies are tested. The first case study is the semi-urban 37-bus distribution system connected at bus 2 of the Roy Billinton Test System, which is located on the secondary side of a 33/11 kV distribution substation. The second case is a 33 kV distribution network based on a modification of the 18-bus IEEE distribution system. The results are compared with prior publications to illustrate the accuracy of the proposed strategy.
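The sketch below shows only the search layer: a plain discrete PSO choosing a capacitor size for each bus. A made-up quadratic compensation cost stands in for the load-flow-based loss calculation, and the "Modified" elements of the paper's algorithm are not reproduced.

```python
# Toy sketch of the search layer only: a plain discrete PSO choosing a capacitor
# size (kvar) per bus. A made-up surrogate replaces the load-flow-based loss cost.
import numpy as np

rng = np.random.default_rng(11)
SIZES = np.array([0, 150, 300, 450, 600])         # candidate capacitor ratings (kvar)
N_BUS = 10
Q_LOAD = rng.uniform(100, 500, N_BUS)             # reactive demand per bus (kvar), synthetic
LOSS_COST, CAP_COST = 0.05, 3.0                   # illustrative cost coefficients

def cost(choice_idx):
    q_cap = SIZES[choice_idx]
    losses = np.sum((Q_LOAD - q_cap) ** 2) / 1e3  # surrogate for line losses after compensation
    return LOSS_COST * losses + CAP_COST * q_cap.sum() / 1e3

def discrete_pso(n_particles=30, iters=200, w=0.6, c1=1.5, c2=1.5):
    pos = rng.integers(len(SIZES), size=(n_particles, N_BUS))
    vel = rng.normal(0, 1, (n_particles, N_BUS))
    pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, N_BUS))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(np.rint(pos + vel), 0, len(SIZES) - 1).astype(int)   # discretise the move
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

best, best_cost = discrete_pso()
print("capacitor kvar per bus:", SIZES[best], "surrogate objective:", round(best_cost, 2))
```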

Relevance:

20.00%

Publisher:

Abstract:

Distributed Genetic Algorithms (DGAs) designed for the Internet have to take its high communication cost into consideration. For island model GAs, the migration topology has a major impact on DGA performance. This paper describes and evaluates an adaptive migration topology optimizer that keeps the communication load low while maintaining high solution quality. Experiments on benchmark problems show that the optimized topology outperforms static or random topologies of the same degree of connectivity. The applicability of the method on real-world problems is demonstrated on a hard optimization problem in VLSI design.
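A minimal island-model GA with the migration topology supplied as a parameter is sketched below (on the one-max problem); it illustrates the setting the paper studies rather than its adaptive topology optimiser, and all parameters are illustrative.

```python
# Minimal island-model GA with a parameterised migration topology (one-max problem).
# A sketch of the setting studied above, not of the adaptive topology optimiser.
import random

L, POP, ISLANDS, EPOCHS, MIGRANTS = 40, 30, 6, 20, 2

def fitness(ind):                        # one-max: count of 1-bits
    return sum(ind)

def evolve(pop, gens=10):
    for _ in range(gens):
        new = []
        for _ in range(len(pop)):
            a, b = random.sample(pop, 2)
            parent = max(a, b, key=fitness)                               # binary tournament
            new.append([bit ^ (random.random() < 1 / L) for bit in parent])  # bit-flip mutation
        pop = new
    return pop

def run(topology):
    """topology: dict island -> list of islands it sends migrants to."""
    islands = [[[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
               for _ in range(ISLANDS)]
    messages = 0
    for _ in range(EPOCHS):
        islands = [evolve(pop) for pop in islands]
        for src, dests in topology.items():                               # migration phase
            best = sorted(islands[src], key=fitness, reverse=True)[:MIGRANTS]
            for dst in dests:
                order = sorted(range(POP), key=lambda i: fitness(islands[dst][i]))
                for slot, migrant in zip(order[:MIGRANTS], best):         # replace the worst
                    islands[dst][slot] = list(migrant)
                messages += MIGRANTS
    return max(fitness(ind) for pop in islands for ind in pop), messages

random.seed(2)
ring = {i: [(i + 1) % ISLANDS] for i in range(ISLANDS)}                   # degree-1 ring topology
print("ring topology -> best fitness %d, migrant messages %d" % run(ring))
```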

Relevance:

20.00%

Publisher:

Abstract:

In the past few years, remarkable progress has been made in unveiling novel and unique optical properties of strongly coupled plasmonic nanostructures. However, application of such plasmonic nanostructures in biomedicine remains challenging due to the lack of facile and robust assembly methods for producing stable nanostructures. Previous attempts to achieve plasmonic nano-assemblies using molecular ligands were limited due to the lack of flexibility that could be exercised in forming them. Here, we report the utilization of tailor-made hyperbranched polymers (HBP) as linkers to assemble gold nanoparticles (NPs) into nano-assemblies. The ease and flexibility in tuning the particle size and number of branch ends of a HBP makes it an ideal candidate as a linker, as opposed to DNA, small organic molecules and linear or dendrimeric polymers. We report a strong correlation of polymer (HBP) concentration with the size of the hybrid nano-assemblies and “hot-spot” density. We have shown that such solutions of stable HBP-gold nano-assemblies can be barcoded with various Raman tags to provide improved surface-enhanced Raman scattering (SERS) compared with non-aggregated NP systems. These Raman barcoded hybrid nano-assemblies, with further optimization of NP shape, size and “hot-spot” density, may find application as diagnostic tools in nanomedicine.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we discuss the rostering of cabin crew attendants at KLM. Generated schedules are easily disrupted by events such as the illness of an employee. Obviously, reserve crew have to be kept 'on duty' to resolve such disruptions. A large reserve pool requires more employees, but too few reserves result in so-called secondary disruptions, which are particularly inconvenient for both the crew members and the planners. In this research we discuss several modifications of the reserve scheduling policy that have the potential to reduce the number of secondary disruptions, and therefore to improve the performance of the scheduling process.
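As a toy illustration of the trade-off described above (with entirely hypothetical rates), the sketch below simulates daily sick calls and counts the absences that cannot be covered by the reserves on duty, i.e. the events that would trigger secondary disruptions.

```python
# Toy Monte Carlo of the reserve-sizing trade-off (all rates hypothetical):
# absences not covered by reserves force a schedule change for someone else,
# i.e. a secondary disruption.
import numpy as np

rng = np.random.default_rng(5)
CREW, SICK_RATE, DAYS = 400, 0.03, 365

def secondary_disruptions(n_reserves, trials=2000):
    sick = rng.binomial(CREW, SICK_RATE, size=(trials, DAYS))   # daily sick calls
    uncovered = np.maximum(sick - n_reserves, 0)
    return uncovered.sum(axis=1).mean()                         # expected uncovered absences/year

for reserves in (8, 12, 16, 20):
    print(f"{reserves:2d} reserves on duty -> "
          f"{secondary_disruptions(reserves):6.1f} expected secondary disruptions per year")
```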
