928 results for Time Optimal
Abstract:
Collaborative networks have come to form a large part of the public sector’s strategy to address ongoing and often complex social problems. The relational power of networks, with its emphasis on trust, reciprocity and mutuality, provides the mechanism to integrate previously dispersed and even competitive entities into a collective venture (Agranoff 2003; Agranoff and McGuire 2003; Mandell 1994; Mandell and Harrington 1999). It is argued that refocusing single bodies of effort into a collective reduces duplication and overlap of services, maximizes increasingly scarce resources and contributes to solving intractable or ‘wicked’ problems (Clarke and Stewart 1997). Given the current proliferation of collaborative networks, and the fact that they are likely to continue for some time, concerns with the management and leadership of such arrangements for optimal outcomes are increasingly relevant. This is especially important for public sector managers who are used to working in a top-down, hierarchical manner. While the management of networks (Agranoff and McGuire 2001, 2003), including collaborative or complex networks (Kickert et al. 1997; Koppenjan and Klijn 2004), has received considerable attention, there has been much less explicit discussion of leadership approaches in this context. It is argued in this chapter that the traditional use of the terms ‘leader’ or ‘leadership’ does not apply to collaborative networks. There are no ‘followers’ or supervisor-subordinate relations in collaborative networks. Instead there are equal, horizontal relationships focused on delivering systems change. In this way, emergent organizational forms such as collaborative networks challenge older models of leadership. However, despite the questionable relevance of old leadership styles to the contemporary work environment, no clear alternative has emerged to take their place.
Abstract:
Electronic Health Record (EHR) systems are being introduced to overcome the limitations associated with paper-based and isolated Electronic Medical Record (EMR) systems. This is accomplished by aggregating medical data and consolidating them in one digital repository. Though an EHR system provides obvious functional benefits, there is a growing concern about the privacy and reliability (trustworthiness) of Electronic Health Records. Security requirements such as confidentiality, integrity and availability can be satisfied by traditional hard security mechanisms. However, measuring data trustworthiness from the perspective of data entry is an issue that cannot be solved with traditional mechanisms, especially since degrees of trust change over time. In this paper, we introduce a Time-variant Medical Data Trustworthiness (TMDT) assessment model that evaluates the trustworthiness of medical data, over a given period of time, through the trustworthiness of its sources: the healthcare organisation where the data was created, and the medical practitioner who diagnosed the patient and authorised entry of the data into the patient’s medical record. The result can then be used by the EHR system to manipulate health record metadata, alerting medical practitioners who rely on the information to possible reliability problems.
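The abstract does not give the TMDT model’s equations. Purely as an illustrative sketch, a time-variant source-trust score could weight past evidence by exponential decay and combine the two source scores; the half-life, equal source weighting and 0.5 neutral prior below are assumptions of this sketch, not the paper’s model:

```python
import math

def decayed_trust(events, now, half_life_days=180.0):
    """Aggregate (timestamp_in_days, rating) pairs, rating in [0, 1],
    weighting recent evidence more heavily via exponential decay.
    Half-life and neutral prior are assumptions for illustration."""
    num = den = 0.0
    for t, rating in events:
        w = math.exp(-math.log(2.0) * (now - t) / half_life_days)
        num += w * rating
        den += w
    return num / den if den else 0.5   # neutral prior when no evidence

def record_trustworthiness(org_events, practitioner_events, now):
    """Combine the two source scores (healthcare organisation and
    medical practitioner); equal weights are an assumption."""
    org = decayed_trust(org_events, now)
    doc = decayed_trust(practitioner_events, now)
    return 0.5 * org + 0.5 * doc
```

The decayed score captures the abstract’s point that degrees of trust change over time: a recent rating moves the score more than an old one of equal value.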
Abstract:
The Internet theoretically enables marketers to personalize a Web site to an individual consumer. This article examines optimal Web site design from the perspective of personality trait theory and resource-matching theory. The influence of two traits relevant to Internet Web site processing, sensation seeking and need for cognition, was studied in the context of resource matching and different levels of Web site complexity. Data were collected at two points in time: personality-trait data and a laboratory experiment using constructed Web sites. Results reveal that (a) subjects prefer Web sites of a medium level of complexity, rather than high or low complexity; (b) high sensation seekers prefer complex visual designs and low sensation seekers simple visual designs, both in Web sites of medium complexity; and (c) high need-for-cognition subjects evaluated Web sites with high verbal and low visual complexity more favourably.
Abstract:
In this third Quantum Interaction (QI) meeting, it is time to examine our failures. One of the weakest elements of QI as a field arises in its continuing lack of models displaying proper evolutionary dynamics. This paper presents an overview of the modern generalised approach to the derivation of time evolution equations in physics, showing how the notion of symmetry is essential to the extraction of operators in quantum theory. The form that symmetry might take in non-physical models is explored, and a number of viable avenues are identified.
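As a textbook illustration of the symmetry-to-evolution route the abstract alludes to (standard quantum mechanics, not a result of this paper): invariance under time translation yields a one-parameter unitary group whose self-adjoint generator, by Stone’s theorem, is the Hamiltonian, and differentiating recovers the Schrödinger equation:

```latex
U(t) = e^{-iHt/\hbar}, \qquad
i\hbar\,\frac{\mathrm{d}}{\mathrm{d}t}\,\lvert\psi(t)\rangle = H\,\lvert\psi(t)\rangle .
```

The open question the paper raises is what plays the role of such a symmetry, and hence of the generator, in non-physical models.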
Abstract:
Bone morphogenetic proteins (BMPs) have been widely investigated for their clinical use in bone repair, and it is known that a suitable carrier matrix to deliver them is essential for optimal bone regeneration within a specific defect site. Fused deposition modeling (FDM) allows for the fabrication of medical-grade poly(ε-caprolactone)–tricalcium phosphate (mPCL–TCP) scaffolds with high reproducibility and tailor-designed dimensions. Here we loaded FDM-fabricated mPCL–TCP/collagen scaffolds with 5 mg recombinant human (rh)BMP-2 and evaluated bone healing within a rat calvarial critical-sized defect. Using a comprehensive approach, this study assessed the newly regenerated bone employing microcomputed tomography (μCT), histology/histomorphometry, and mechanical assessments. By 15 weeks, mPCL–TCP/collagen/rhBMP-2 defects exhibited complete healing of the calvarium, whereas the non-BMP-2-loaded scaffolds showed significantly less bone ingrowth, as confirmed by μCT. Histomorphometry revealed significantly increased bone healing in the rhBMP-2 groups compared to non-treated scaffolds at 4 and 15 weeks, although the % BV/TV did not indicate complete mineralisation of the entire defect site. Hence, our study confirms that it is important to combine μCT and histomorphometry in order to study bone regeneration comprehensively in 3D. A significant up-regulation of the osteogenic proteins type I collagen and osteocalcin was evident at both time points in the rhBMP-2 groups. Although mineral apposition rates at 15 weeks were statistically equivalent amongst treatment groups, microcompression and push-out strengths indicated superior bone quality at 15 weeks for defects treated with mPCL–TCP/collagen/rhBMP-2.
Consistently across all modalities, the progression of healing was from empty defect < mPCL–TCP/collagen < mPCL–TCP/collagen/rhBMP-2, providing substantiating data to support the hypothesis that the release of rhBMP-2 from FDM-created mPCL–TCP/collagen scaffolds is a clinically relevant approach to repairing and regenerating critically sized craniofacial bone defects.
Abstract:
In Web service based systems, new value-added Web services can be constructed by integrating existing Web services. A Web service may have many implementations that are functionally identical but have different Quality of Service (QoS) attributes, such as response time, price, reputation, reliability and availability. Thus, a significant research problem in Web service composition is how to select an implementation for each of the component Web services so that the overall QoS of the composite Web service is optimal. This is the so-called QoS-aware Web service composition problem. In some composite Web services there are dependencies and conflicts between the Web service implementations, yet existing approaches cannot handle such constraints. This paper tackles the QoS-aware Web service composition problem with inter-service dependencies and conflicts using a penalty-based genetic algorithm (GA). Experimental results demonstrate the effectiveness and scalability of the penalty-based GA.
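The penalty idea can be sketched in a few lines: instead of discarding selections that violate a dependency or conflict constraint, the GA subtracts a penalty from their fitness so they remain in the population but are discouraged. The QoS utilities, the conflict pair and the penalty weight below are invented for illustration; the paper’s encoding and operators may differ:

```python
import random

random.seed(0)

# qos[i][j]: aggregated QoS utility of implementation j of component service i
qos = [[0.9, 0.6, 0.3],
       [0.4, 0.8, 0.7],
       [0.5, 0.9, 0.2]]
# implementation 0 of service 0 conflicts with implementation 1 of service 1
conflicts = {((0, 0), (1, 1))}
PENALTY = 1.0

def fitness(chrom):
    """Sum of per-service utilities minus a penalty per violated constraint."""
    score = sum(qos[i][g] for i, g in enumerate(chrom))
    violations = sum(1 for (a, b) in conflicts
                     if chrom[a[0]] == a[1] and chrom[b[0]] == b[1])
    return score - PENALTY * violations

def evolve(pop_size=20, generations=50):
    pop = [[random.randrange(3) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 3)
            child = a[:cut] + b[cut:]            # one-point crossover
            if random.random() < 0.1:            # mutation
                child[random.randrange(3)] = random.randrange(3)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

With these toy numbers, the selection with the highest raw utility triggers the conflict penalty, so the feasible optimum chooses a slightly lower-utility implementation for service 1 instead.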
Abstract:
In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers’ behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users’ actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The time period rarely affects navigational queries, while rates for transactional queries vary across periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
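The abstract does not specify the forecasting models used, so purely as an illustration of what one-step-ahead prediction on a log-derived series looks like, here is a minimal AR(1) fit by least squares (the model choice and input series are assumptions of this sketch):

```python
# Illustrative one-step-ahead forecast: fit x[t+1] = c + phi * x[t]
# by least squares and extrapolate one step past the last observation.

def ar1_one_step(series):
    """Return the one-step-ahead AR(1) forecast for the value after
    the last observation. Assumes a non-constant numeric series."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var                 # autoregressive slope
    c = my - phi * mx               # intercept
    return c + phi * series[-1]
```

For example, hourly counts of sponsored-link clicks could be fed in as `series`; on a perfectly linear series the forecast simply continues the trend.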
Abstract:
When complex projects go wrong, they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics to provide early warning of potential difficulties. The assessment of the success of complex projects can be made by a range of stakeholders over different time scales, against different levels of project results: the project’s outputs at the end of the project; the project’s outcomes in the months following project completion; and the project’s impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells, showing whether a project is deviating from plan so that early corrective action can be taken. It may be that different combinations of the leading performance indicators will be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology is described to evaluate the early parts of this model, and its implications and limitations are discussed. This paper describes work in progress.
Abstract:
Purpose – The paper aims to describe a workforce-planning model developed in-house in an Australian university library that is based on rigorous environmental scanning of an institution, the profession and the sector. Design/methodology/approach – The paper uses a case study that describes the stages of the planning process undertaken to develop the Library’s Workforce Plan and the documentation produced. Findings – While it has been found that the process has had successful and productive outcomes, workforce planning is an ongoing process. To remain effective, the workforce plan needs to be reviewed annually in the context of the library’s overall planning program. This is imperative if the plan is to remain current and to be regarded as a living document that will continue to guide library practice. Research limitations/implications – Although a single case study, the work has been contextualized within the wider research into workforce planning. Practical implications – The paper provides a model that can easily be deployed within a library without external or specialist consultant skills, and due to its scalability can be applied at department or wider level. Originality/value – The paper identifies the trends impacting on, and the emerging opportunities for, university libraries and provides a model for workforce planning that recognizes the context and culture of the organization as key drivers in determining workforce planning. Keywords – Australia, University libraries, Academic libraries, Change management, Manpower planning. Paper type – Case study
Abstract:
Cognitive-energetical theories of information processing were used to generate predictions regarding the relationship between workload and fatigue within and across consecutive days of work. Repeated measures were taken on board a naval vessel during a non-routine and a routine patrol. Data were analyzed using growth curve modeling. Fatigue demonstrated a non-monotonic relationship within days in both patrols: fatigue was high at midnight, decreased until noontime and then increased again. Fatigue increased across days towards the end of the non-routine patrol, but remained stable across days in the routine patrol. The relationship between workload and fatigue changed over consecutive days in the non-routine patrol: at the beginning of the patrol, low workload was associated with fatigue, while at the end, high workload was associated with fatigue. This relationship could not be tested in the routine patrol, which nevertheless demonstrated a non-monotonic relationship between workload and fatigue, with both low and high workloads associated with the highest fatigue. These results suggest that the optimal level of workload can change over time, and thus have implications for the management of fatigue.
Abstract:
This paper presents a model to estimate travel time using cumulative plots. Three different cases are considered: (i) case-Det, with only detector data; (ii) case-DetSig, with detector data and signal controller data; and (iii) case-DetSigSFR, with detector data, signal controller data and saturation flow rate. The performance of the model for different detection intervals is evaluated. It is observed that the detection interval is not critical if signal timings are available: comparable accuracy can be obtained from a larger detection interval with signal timings or from a shorter detection interval without them. The performance for case-DetSig and case-DetSigSFR is consistent, with accuracy generally above 95%, whereas case-Det is highly sensitive to the signal phases in the detection interval and its performance is uncertain if the detection interval is an integral multiple of signal cycles.
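The cumulative-plot principle behind such models can be sketched briefly (the FIFO assumption and linear interpolation between observations are assumptions of this sketch, not necessarily the paper’s model): given cumulative vehicle counts at an upstream and a downstream detector, the travel time of the n-th vehicle is the horizontal gap between the two curves at count n:

```python
# Estimate travel time as the horizontal distance between upstream and
# downstream cumulative-count curves (first-in, first-out assumed).

def crossing_time(curve, n):
    """Time at which a piecewise-linear cumulative curve reaches count n.
    curve: list of (time, cumulative_count), both non-decreasing."""
    for (t0, c0), (t1, c1) in zip(curve, curve[1:]):
        if c0 <= n <= c1 and c1 > c0:
            # linear interpolation within the segment
            return t0 + (t1 - t0) * (n - c0) / (c1 - c0)
    raise ValueError("count n outside the observed range")

def travel_time(upstream, downstream, n):
    """Travel time of the n-th vehicle between the two detectors."""
    return crossing_time(downstream, n) - crossing_time(upstream, n)
```

Signal timings enter such models by reshaping the downstream curve into alternating red (flat) and green (rising) segments, which is why their availability reduces sensitivity to the detection interval.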
Abstract:
To allocate and size capacitors in a distribution system, an optimization algorithm called Discrete Particle Swarm Optimization (DPSO) is employed in this paper. The objective is to minimize the transmission line loss cost plus the capacitor cost. During the optimization procedure, the bus voltages, the feeder currents and the reactive power flowing back to the source side must be maintained within standard limits. To validate the proposed method, the semi-urban distribution system connected to bus 2 of the Roy Billinton Test System (RBTS) is used. This 37-bus distribution system has 22 loads located on the secondary side of a distribution substation (33/11 kV). By reducing the transmission line loss in a standard system in which line loss accounts for only about 6.6 percent of total power, the capabilities of the proposed technique are validated.
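A toy version of the discrete search can illustrate the idea: each particle holds one discrete capacitor size per candidate bus, and the swarm minimises a loss-plus-capacitor-cost objective. The objective function, bus data and simplified discrete update rule below are invented stand-ins for illustration, not the RBTS bus-2 feeder or the paper’s exact formulation:

```python
import random

random.seed(1)

SIZES = [0, 150, 300, 450, 600]   # allowable discrete capacitor sizes (kvar)
DEMAND = [380, 520, 240]          # reactive demand per candidate bus (invented)

def cost(x):
    """Quadratic proxy for line-loss cost plus a linear capacitor cost."""
    loss = sum((d - SIZES[s]) ** 2 * 1e-4 for d, s in zip(DEMAND, x))
    capex = sum(0.05 * SIZES[s] for s in x)
    return loss + capex

def dpso(n_particles=15, iters=60):
    dim = len(DEMAND)
    # seed with the no-capacitor solution so the search can only improve on it
    parts = [[0] * dim] + [[random.randrange(len(SIZES)) for _ in range(dim)]
                           for _ in range(n_particles - 1)]
    pbest = [p[:] for p in parts]
    gbest = min(pbest, key=cost)
    for _ in range(iters):
        for i, p in enumerate(parts):
            for d in range(dim):
                r = random.random()
                if r < 0.4:
                    p[d] = pbest[i][d]                    # pull toward personal best
                elif r < 0.8:
                    p[d] = gbest[d]                       # pull toward global best
                else:
                    p[d] = random.randrange(len(SIZES))   # random exploration
            if cost(p) < cost(pbest[i]):
                pbest[i] = p[:]
        gbest = min(pbest + [gbest], key=cost)
    return gbest
```

A full implementation would replace `cost` with a power-flow computation and reject particles that violate the voltage, current and reverse-reactive-power limits the paper describes.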
Abstract:
Fatigue and overwork are problems experienced by numerous employees in many industry sectors. Framing the ‘problem’ of long work hours in terms of work-life balance can help resolve working-time duration issues. Offering flexible work options by re-organising working time arrangements is key to developing an organisational response that delivers work-life balance, and usually involves changing the internal structure of work time. This study examines the effect of compressed long weekly working hours, and the consequent ‘long break’, on work-life balance. Using spillover theory and border theory, this research considers organisational and personal determinants of overwork and fatigue. It concludes that compressed long work hours with a long break provide better work-life balance, and that a long break allows employees to gain ‘personal time’ and overcome fatigue.