897 results for Multicommodity network design problem
Abstract:
In this paper, we propose a new load distribution strategy called `send-and-receive' for scheduling divisible loads in a linear network of processors with communication delay. This strategy is designed to utilize the network resources optimally and thereby minimize the processing time of the entire processing load. Closed-form expressions for the optimal sizes of the load fractions and the processing time are derived for the cases where the processing load originates at a processor on the boundary or in the interior of the network. A condition on processor and link speeds is also derived to ensure that the processors are continuously engaged in load distribution. This paper also presents a parallel implementation of the `digital watermarking problem' on a personal computer-based Pentium Linear Network (PLN) topology. Experiments are carried out to study the performance of the proposed strategy, and the results are compared with other strategies found in the literature.
Abstract:
Using the imagination during the design process is a critical part of how designers design; they use it in the synthesis phase to generate ideas and find creative solutions to a given problem. However, what designers imagine - see in the mind's eye - during the design process is a complex and difficult-to-articulate phenomenon which, until recently, has not been greatly understood or articulated. This early study reports on an education context where exercises aimed at enhancing the imagining process were integrated into undergraduate design studies. Outcomes suggest that exercising the imagination in this context assists future designers to become more skilled in design synthesis practices that explore various temporal, existential and physical qualities in future spaces, as well as to articulate the seemingly 'mysterious' aspects of the design process.
Abstract:
This paper proposes and explores the Deep Customer Insight Innovation Framework in order to develop an understanding of how design can be integrated within existing innovation processes. The Deep Customer Insight Innovation Framework synthesises the work of Beckman and Barry (2007) as a theoretical foundation, with the framework explored within a case study of Australian Airport Corporation seeking to drive airport innovations in operations and retail performance. The integration of a deep customer insight approach develops customer-centric and highly integrated solutions as a function of concentrated problem exploration and design-led idea generation. Businesses facing complex innovation challenges or seeking to make sense of future opportunities will be able to integrate design into existing innovation processes, anchoring the new approach between existing market research and business development activities. This paper contributes a framework and a novel understanding of how design methods can be integrated into existing innovation processes for operationalization within industry.
Abstract:
Background The Palliative Care Problem Severity Score is a clinician-rated tool to assess problem severity in four palliative care domains (pain, other symptoms, psychological/spiritual, family/carer problems) using a 4-point categorical scale (absent, mild, moderate, severe). Aim To test the reliability and acceptability of the Palliative Care Problem Severity Score. Design Multi-centre, cross-sectional study involving pairs of clinicians independently rating problem severity using the tool. Setting/participants Clinicians from 10 Australian palliative care services: 9 inpatient units and 1 mixed inpatient/community-based service. Results A total of 102 clinicians participated, with almost 600 paired assessments completed for each domain, involving 420 patients. A total of 91% of paired assessments were undertaken within 2 h. Strength of agreement for three of the four domains was moderate: pain (Kappa = 0.42, 95% confidence interval = 0.36 to 0.49); psychological/spiritual (Kappa = 0.48, 95% confidence interval = 0.42 to 0.54); family/carer (Kappa = 0.45, 95% confidence interval = 0.40 to 0.52). Strength of agreement for the remaining domain (other symptoms) was fair (Kappa = 0.38, 95% confidence interval = 0.32 to 0.45). Conclusion The Palliative Care Problem Severity Score is an acceptable measure, with moderate reliability across three domains. Variability in inter-rater reliability across sites and participant feedback indicate that ongoing education is required to ensure that clinicians understand the purpose of the tool and each of its domains. Raters familiar with the patient they were assessing found it easier to assign problem severity, but this did not improve inter-rater reliability.
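The inter-rater agreement figures above are Cohen's Kappa values. As a quick illustration of how the statistic behaves, the sketch below computes Kappa from a paired-rating contingency table; the 2 x 2 table used here is hypothetical and is not the study's data.

```python
def cohens_kappa(table):
    # table[i][j]: count of items rater A scored category i and rater B scored j.
    total = sum(sum(row) for row in table)
    # Observed agreement: fraction of items on the diagonal.
    p_obs = sum(table[i][i] for i in range(len(table))) / total
    # Expected agreement by chance, from the raters' marginal distributions.
    p_exp = sum(sum(table[i]) * sum(row[i] for row in table)
                for i in range(len(table))) / total ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical paired severity ratings collapsed to two levels.
kappa = cohens_kappa([[20, 5], [10, 15]])  # -> 0.4, i.e. "moderate" agreement
```

A Kappa of 0 means agreement no better than chance and 1 means perfect agreement, which is why the 0.38 to 0.48 range reported above is read as "fair" to "moderate".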
Abstract:
A general method for the preparation of novel disulfide-tethered macrocyclic diacylglycerols (DAGs) has been described. The overall synthesis involved stepwise protection, acylation, and deprotection to yield the bis(omega-bromoacyl) glycerols. In the crucial macrocyclization step, a unique reagent, benzyltriethylammonium tetrathiomolybdate (BTAT), has been used to convert individual bis(omega-bromoacyl) glycerols to their respective macrocyclic disulfides. DAG 6, which had ether linkages between hydrocarbon chains and the glycerol backbone, was also synthesized from an appropriate precursor using a similar protocol. One of the DAGs (DAG 5) had a carbon-carbon tether instead of a disulfide one and was synthesized using modified Glaser coupling. Preparation of alpha-disulfide-tethered DAG (DAG 4) required an alternative method, as treatment of the bisbromo precursor with BTAT gave a mixture of several compounds from which separation of the target molecule was cumbersome. To avoid this problem, the bisbromide was converted to its corresponding dithiocyanate, which on further treatment with BTAT yielded the desired DAG (DAG 4) in good yield. Upon treatment with the reducing agent dithiothreitol (DTT), the DAGs that contain a disulfide tether could be quantitatively converted to their "open-chain" thiol analogues. These macrocyclic DAGs and their reduced "open-chain" analogues have been incorporated in DPPC vesicles to study their effect on model membranes. Upon incorporation of DAG 1 in DPPC vesicles, formation of new isotropic phases was observed by P-31 NMR. These isotropic phases disappeared completely on opening the macrocyclic ring by a reducing agent. The thermotropic properties of DPPC bilayers having DAGs (1-6) incorporated at various concentrations were studied by differential scanning calorimetry. Incorporation of DAGs in general reduced the cooperativity unit (CU) of the vesicles.
Similar experiments with reduced "open-chain" DAGs incorporated in a DPPC bilayer indicated a recovery of CU with respect to their macrocyclic "disulfide" counterparts. The effect of inclusion of these DAGs on the activity of phospholipase A(2) (PLA(2)) was studied in vitro. Incorporation of DAG 1 in DPPC membranes potentiated both bee venom and cobra venom PLA(2) activities.
Abstract:
This paper presents an off-line (finite time interval) and on-line learning direct adaptive neural controller for an unstable helicopter. The neural controller is designed to track a pitch rate command signal generated using a reference model. A helicopter having a soft inplane four-bladed hingeless main rotor and a four-bladed tail rotor with conventional mechanical controls is used for the simulation studies. For the simulation study, linearized helicopter models at different straight and level flight conditions are considered. A neural network with a linear filter architecture, trained using backpropagation through time, is used to approximate the control law. The controller network parameters are adapted using update rules derived from Lyapunov synthesis. The off-line trained (for a finite time interval) network provides the necessary stability and tracking performance, while on-line learning is used to adapt the network under varying flight conditions. The on-line learning ability is demonstrated through parameter uncertainties. The performance of the proposed direct adaptive neural controller (DANC) is compared with a feedback error learning neural controller (FENC).
Abstract:
PROBLEM: The cost of delivering medium-density apartments impedes the supply of new and more affordable housing in established suburbs. EXISTING FOCUS: planning controls; construction costs, especially labour; regulation, e.g. sustainability.
Abstract:
Architects regularly employ design as a problem-solving tool in the built environment. Within the design process, architects apply design thinking to reframe problems as opportunities, take advantage of contradictory information to develop new solutions, and differentiate outcomes based on context. This research aims to investigate how design can be better positioned to deliver greater differentiated value within an architect's current service offering, and how design as a strategy could be applied as a driver of business innovation within the Australian architecture industry. The research will explore literature relating to the future of architecture, the application of design thinking, and the benefits of strategic design. The future intent of the research is to develop strategies that improve the value offering of architects and to develop design-led solutions that could be applied successfully to the business of architecture.
Abstract:
Queensland University of Technology (QUT), School of Nursing (SoN), has offered a postgraduate Graduate Certificate in Emergency Nursing since 2003 for registered nurses practising in an emergency clinical area who fulfil key entry criteria. Feedback from industry partners and students evidenced support for flexible and extended study pathways in emergency nursing. Therefore, in the context of a growing demand for emergency health services and the need for specialist qualified staff, it was timely to review and redevelop our emergency specialist nursing courses. The QUT postgraduate emergency nursing study area is supported by a course advisory group, whose aim is to provide input and focus the development of current and future course planning. All members of the course advisory group were invited to form an expert panel to review current emergency course documents. A half-day "brainstorm session" planning and development workshop was held to review the emergency courses and implement changes from 2009. Results from the expert panel planning day include: a proposal for a new emergency specialty unit; incorporation of the College of Emergency Nurses (CENA) Standards for Emergency Nursing Specialist in clinical assessment; modification of the present core emergency unit; an enhanced focus for the two other units that emergency students undertake; and the opening of the emergency study area to the Graduate Diploma in Nursing (Emergency Nursing) and Master of Nursing (Emergency Nursing). The brainstorm session resulted in a clearer conceptualisation of the study pathway for students. Overall, the expert panel group of enthusiastic emergency educators and clinicians provided viable options for extending the career progression opportunities for emergency nurses. In conclusion, the opportunity for collaboration across university and clinical settings has resulted in the design of a course with exciting potential and strong clinical relevance.
Abstract:
Background: A genetic network can be represented as a directed graph in which a node corresponds to a gene and a directed edge specifies the direction of influence of one gene on another. The reconstruction of such networks from transcript profiling data remains an important yet challenging endeavor. A transcript profile specifies the abundances of many genes in a biological sample of interest. Prevailing strategies for learning the structure of a genetic network from high-dimensional transcript profiling data assume sparsity and linearity. Many methods consider relatively small directed graphs, inferring graphs with up to a few hundred nodes. This work examines large undirected graph representations of genetic networks (graphs with many thousands of nodes, where an undirected edge between two nodes does not indicate the direction of influence) and the problem of estimating the structure of such a sparse linear genetic network (SLGN) from transcript profiling data. Results: The structure learning task is cast as a sparse linear regression problem, which is then posed as a LASSO (l1-constrained fitting) problem and finally solved by formulating a Linear Program (LP). A bound on the Generalization Error of this approach is given in terms of the Leave-One-Out Error. The accuracy and utility of LP-SLGNs is assessed quantitatively and qualitatively using simulated and real data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) initiative provides gold standard data sets and evaluation metrics that enable and facilitate the comparison of algorithms for deducing the structure of networks. The structures of LP-SLGNs estimated from the INSILICO1, INSILICO2 and INSILICO3 simulated DREAM2 data sets are comparable to those proposed by the first- and/or second-ranked teams in the DREAM2 competition. The structures of LP-SLGNs estimated from two published Saccharomyces cerevisiae cell cycle transcript profiling data sets capture known regulatory associations.
In each S. cerevisiae LP-SLGN, the number of nodes with a particular degree follows an approximate power law, suggesting that its degree distribution is similar to that observed in real-world networks. Inspection of these LP-SLGNs suggests biological hypotheses amenable to experimental verification. Conclusion: A statistically robust and computationally efficient LP-based method for estimating the topology of a large sparse undirected graph from high-dimensional data yields representations of genetic networks that are biologically plausible and useful abstractions of the structures of real genetic networks. Analysis of the statistical and topological properties of learned LP-SLGNs may have practical value; for example, genes with high random walk betweenness, a measure of the centrality of a node in a graph, are good candidates for intervention studies and hence for integrated computational-experimental investigations designed to infer more realistic and sophisticated probabilistic directed graphical model representations of genetic networks. The LP-based solutions of the sparse linear regression problem described here may provide a method for learning the structure of transcription factor networks from transcript profiling and transcription factor binding motif data.
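The per-gene sparse-regression idea behind the SLGN approach can be sketched as follows: each gene's abundance is regressed on all other genes with an l1 penalty, and surviving nonzero weights become undirected edges. This is only a minimal illustration on synthetic data; a small coordinate-descent LASSO stands in for the paper's LP formulation, and the data, penalty, and threshold below are assumptions.

```python
import numpy as np

def lasso_cd(X, y, alpha=0.1, n_iter=200):
    """Coordinate-descent LASSO: min (1/2n)||y - Xw||^2 + alpha*||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            resid = y - X @ w + X[:, j] * w[j]   # residual excluding feature j
            rho = X[:, j] @ resid / n
            z = X[:, j] @ X[:, j] / n
            w[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / z  # soft-threshold
    return w

rng = np.random.default_rng(0)
n_samples, n_genes = 60, 10
expr = rng.standard_normal((n_samples, n_genes))          # synthetic profiles
expr[:, 0] = 0.9 * expr[:, 1] + 0.1 * rng.standard_normal(n_samples)  # planted link

# One sparse regression per gene; nonzero weights become undirected edges.
edges = set()
for g in range(n_genes):
    others = [j for j in range(n_genes) if j != g]
    w = lasso_cd(expr[:, others], expr[:, g])
    edges |= {frozenset((g, j)) for j, wj in zip(others, w) if abs(wj) > 1e-3}
```

Because the graph is undirected, each recovered edge is stored as an unordered pair; the planted dependency between genes 0 and 1 is picked up from either regression direction.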
Abstract:
This paper presents a power, latency and throughput trade-off study on NoCs obtained by varying microarchitectural (e.g. pipelining) and circuit-level (e.g. frequency and voltage) parameters. We change the pipelining depth, operating frequency and supply voltage for three example NoCs: a 16-node 2D Torus, a Tree network and a Reduced 2D Torus. We use an in-house NoC exploration framework capable of topology generation and comparison using parameterized models of routers and links developed in SystemC. The framework utilizes interconnect power and delay models from a low-level modelling tool called Intacte [1]. We find that increased pipelining can actually reduce latency. We also find that there exists an optimal degree of pipelining which is the most energy efficient in terms of minimizing the energy-delay product.
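The seemingly counterintuitive finding that deeper pipelining can reduce latency, while the energy-delay optimum sits at a shallower depth, can be reproduced with a toy first-order model. This is not the paper's Intacte-based model; every constant below is a hypothetical illustration.

```python
HOPS, FLITS = 4, 8        # path length (hops) and packet size (flits); assumed
LOGIC, REG = 2.0, 0.3     # router logic delay and per-stage register overhead
E_LOGIC, E_REG = 1.0, 0.2 # per-flit energy: logic vs. one pipeline register

def latency(d):
    # Deeper pipeline -> shorter cycle, but more stages to fill per hop.
    cycle = LOGIC / d + REG
    return (HOPS * d + FLITS - 1) * cycle   # wormhole-style pipelined traversal

def energy(d):
    # Each extra pipeline stage adds register energy for every flit and hop.
    return HOPS * FLITS * (E_LOGIC + E_REG * d)

def edp(d):
    return energy(d) * latency(d)

depths = range(1, 7)
best_latency = min(depths, key=latency)  # deeper than 1 stage: pipelining helps
best_edp = min(depths, key=edp)          # energy-delay optimum is shallower
```

With these numbers the latency-optimal depth is 3 while the energy-delay-optimal depth is 2, mirroring the qualitative trend the abstract reports: pipelining can cut latency, but the most energy-efficient depth is not the latency-optimal one.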
Abstract:
The WiFiRe (WiFi Rural Extension) proposal for rural broadband access is being developed under the aegis of CEWIT. The system leverages the widely available, and highly cost-reduced, WiFi chipsets; however, only the physical layer from these chipsets is retained. A single base station carries several WiFi transceivers, each serving one sector of the cell, and all operating on the same WiFi channel in a time division duplex (TDD) manner. We replace the contention-based WiFi MAC with a single-channel TDD multisector TDM MAC similar to the WiMax MAC. In this paper we discuss in detail the issues in designing such a MAC for the purpose of carrying packet voice telephony and providing Internet access. The problem of determining the optimal spatial reuse is formulated, and the optimal spatial reuse and the corresponding cell size are derived. The voice and data scheduler is then designed, and it is shown how throughput fairness can be implemented in the data scheduler. A capacity assessment of the system is also provided.
Abstract:
During the past few decades, the development of efficient methods to solve dynamic facility layout problems has received significant attention from practitioners and researchers. More specifically, meta-heuristic algorithms, especially the genetic algorithm, have proven increasingly helpful in generating sub-optimal solutions for large-scale dynamic facility layout problems. Nevertheless, the uncertainty of the manufacturing factors, in addition to the scale of the layout problem, calls for a mixed genetic algorithm-robust approach that can provide a single layout design for all periods. The present research aims to devise a customized permutation-based robust genetic algorithm for dynamic manufacturing environments that is expected to generate a unique robust layout for all the manufacturing periods. The numerical outcomes of the proposed robust genetic algorithm indicate significant cost improvements compared to conventional genetic algorithm methods and a selective number of other heuristic and meta-heuristic techniques.
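The core of a permutation-based robust approach is that a single facility-to-location assignment is scored against the flow matrices of every period at once. The sketch below is a minimal, mutation-only evolutionary loop on toy data (all instance sizes, flows, and GA settings are hypothetical, not the paper's method or results); a full GA would add crossover and the paper's customizations.

```python
import random

random.seed(1)
N_FAC, N_PERIODS = 6, 3
# Toy per-period flow matrices and fixed inter-location distances.
flows = [[[random.randint(0, 9) for _ in range(N_FAC)] for _ in range(N_FAC)]
         for _ in range(N_PERIODS)]
dist = [[abs(i - j) for j in range(N_FAC)] for i in range(N_FAC)]

def robust_cost(perm):
    # One unchanging layout (perm maps facility -> location), summed over periods.
    return sum(flows[t][a][b] * dist[perm[a]][perm[b]]
               for t in range(N_PERIODS)
               for a in range(N_FAC) for b in range(N_FAC))

def mutate(perm):
    # Swap mutation keeps the chromosome a valid permutation.
    p = perm[:]
    i, j = random.sample(range(N_FAC), 2)
    p[i], p[j] = p[j], p[i]
    return p

pop = [random.sample(range(N_FAC), N_FAC) for _ in range(20)]
for _ in range(100):
    pop.sort(key=robust_cost)                     # elitist selection
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]
best = min(pop, key=robust_cost)
```

Because the fitness sums material-handling cost over all periods, the winner is a compromise layout that never has to be rearranged between periods, which is the defining property of the robust formulation.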
Abstract:
DRAMATURGY OF THEATRE MANAGEMENT: Essential tasks, everyday problems and the need for structural changes. Theatre justifies its existence only through high-quality performances. Maintaining the artistic level and organizing performances are the primary tasks of a manager, even though in everyday life this often seems to be overshadowed by all the other tasks of a manager's work. How does a theatre manager design strategies and make everyday decisions if the aims are to have artistically meaningful performances, financial success and a socially healthy ensemble, when not only the artistic work or the leadership of an organization are to be taken into consideration, but also a manpower-based art institution with long traditions? What does theatre management consist of, and what kind of dramaturgical movement happens in it? Based on interviews carried out in five different city theatres in Finland in the years 2004-2008, incident stories were written within a continuous-comparison theoretical frame. Social constructionism within a dramaturgic framework enabled versatile dialogue on a manager's work and problem areas. The result is an interpretative study where, instead of common regularities, many details are collected that can be taken into consideration when similar situations occur. Based on the interviews and historical data, four factors that influence a manager's work were chosen: ownership, media, work community and programme. Within theatre management, the central problems were 1) the inconsistent use of theatre resources and problems in corporate governance caused by the administrative models; 2) the theatre's image, based on the image of its manager, as presented by the media, and its influence on the wellbeing of the staff; 3) unsolved problems between the staff left behind by previous managers, and problems related to casting; 4) knowledge of the audience. These points influence how the manager plans the artistic programme and divides the resources.
The theatre manager's job description has remained much the same since the early days of Kaarlo Bergbom. In the future, special attention should be paid to why managers face fairly similar problems decade after decade. Reducing these problems partly depends on whether structural improvements are made to a theatre's close network of owners, financers and labour unions. During this study, clear evidence emerged that structural changes are necessary in the production of performances and in the creation of a more versatile programme. In this process, different kinds of co-operation, experiments, development projects, continuing education and international relations have special importance, especially if the aim is to make it possible for all citizens of Finland to enjoy a vibrant and revitalized theatre.
Abstract:
In this paper we study two problems in feedback stabilization. The first is the simultaneous stabilization problem, which can be stated as follows: given plants G_0, G_1, ..., G_l, does there exist a single compensator C that stabilizes all of them? The second is that of stabilization by a stable compensator or, more generally, a "least unstable" compensator: given a plant G, we would like to know whether or not there exists a stable compensator C that stabilizes G; if not, what is the smallest number of right half-plane poles (counted according to their McMillan degree) that any stabilizing compensator must have? We show that the two problems are equivalent in the following sense. The problem of simultaneously stabilizing l + 1 plants can be reduced to the problem of simultaneously stabilizing l plants using a stable compensator, which in turn can be stated as the following purely algebraic problem: given 2l matrices A_1, ..., A_l, B_1, ..., B_l, where A_i, B_i are right-coprime for all i, does there exist a matrix M such that A_i + M B_i is unimodular for all i? Conversely, the problem of simultaneously stabilizing l plants using a stable compensator can be formulated as one of simultaneously stabilizing l + 1 plants. The problem of determining whether or not there exists an M such that A + BM is unimodular, given a right-coprime pair (A, B), turns out to be a special case of a question concerning a matrix division algorithm in a proper Euclidean domain. We give an answer to this question, and we believe this result might be of some independent interest. We show that, given two n x m plants G_0 and G_1, we can generically stabilize them simultaneously provided either n or m is greater than one. In contrast, simultaneous stabilizability of two single-input-single-output plants, g_0 and g_1, is not generic.
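The central algebraic question to which the abstract reduces simultaneous stabilization can be stated compactly (this merely restates the abstract's formulation in display form):

```latex
\text{Given } 2l \text{ matrices } A_1,\dots,A_l,\ B_1,\dots,B_l,
\text{ with each pair } (A_i, B_i) \text{ right-coprime,} \\
\text{does there exist a matrix } M \text{ such that }
A_i + M B_i \text{ is unimodular for } i = 1,\dots,l\,?
```

For l = 1 this is the stable-compensator question for a single plant, and it is the special case the abstract connects to a matrix division algorithm in a proper Euclidean domain.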