876 results for Multi objective optimizations (MOO)
Abstract:
One of the ways in which university departments and faculties can enhance the quality of learning and assessment is to develop a ‘well thought out criterion-referenced assessment system’ (Biggs, 2003, p. 271). In designing undergraduate degrees (courses), this entails making decisions about the levelling of expectations across different years by devising objectives and their corresponding criteria and standards: a process of alignment analogous to what happens in unit (subject) design. These decisions about levelling have important repercussions for supporting students’ work-related learning, especially in relation to their ability to cope with the increasing cognitive and skill demands made on them as they progress through their studies. They also affect the accountability of teacher judgments of students’ responses to assessment tasks, achievement of unit objectives and, ultimately, whether students are awarded their degrees and are sufficiently prepared for the world of work. Research reveals that this decision-making process is rarely underpinned by an explicit educational rationale (Morgan et al., 2002). The decision to implement criterion-referenced assessment in an undergraduate microbiology degree was the impetus for developing such a rationale, because of the implications for alignment and therefore for the ‘levelling’ of expectations across different years of the degree. This paper provides supporting evidence for a multi-pronged approach to levelling, through backward mapping of two revised units (foundation and exit year). This approach adheres to the principles of alignment while combining a work-related approach (via industry input) with the blended disciplinary and learner-centred approaches proposed by Morgan et al. (2002). It is suggested that this multi-pronged approach has the potential to make expectations, especially work-related ones, across different year levels of degrees more explicit to students and future employers.
Abstract:
Classical negotiation models are weak in supporting real-world business negotiations because these models often assume that the preference information of each negotiator is made public. Although parametric learning methods have been proposed for acquiring the preference information of negotiation opponents, these methods suffer from the strong assumptions about the specific utility function and negotiation mechanism employed by the opponents. Consequently, it is difficult to apply these learning methods to the heterogeneous negotiation agents participating in e‑marketplaces. This paper illustrates the design, development, and evaluation of a nonparametric negotiation knowledge discovery method which is underpinned by the well-known Bayesian learning paradigm. According to our empirical testing, the novel knowledge discovery method can speed up the negotiation processes while maintaining negotiation effectiveness. To the best of our knowledge, this is the first nonparametric negotiation knowledge discovery method developed and evaluated in the context of multi-issue bargaining over e‑marketplaces.
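As a rough illustration of the Bayesian learning paradigm the abstract refers to (the paper's actual nonparametric method is not specified here), the following Python sketch maintains a discrete belief over a hypothetical opponent's reservation price and updates it after each observed counter-offer; the linear-concession likelihood, the opening price and all numbers are assumptions made purely for illustration.

```python
# Minimal sketch of Bayesian opponent modelling in bargaining.
# Hypothetical setup: a discrete belief over the opponent's reservation
# price, updated after each counter-offer under an assumed model in which
# offers concede linearly from an opening price toward the reservation.
import numpy as np

candidates = np.linspace(50.0, 150.0, 101)           # hypothesised reservation prices
belief = np.ones_like(candidates) / len(candidates)  # uniform prior

def likelihood(offer, t, horizon, reservation, noise=5.0):
    """Probability of seeing `offer` at round t if the opponent concedes
    linearly from an opening price of 200 toward `reservation`."""
    expected = 200.0 + (reservation - 200.0) * (t / horizon)
    return np.exp(-0.5 * ((offer - expected) / noise) ** 2)

def update(belief, offer, t, horizon):
    posterior = belief * likelihood(offer, t, horizon, candidates)
    return posterior / posterior.sum()

for t, offer in enumerate([190.0, 175.0, 162.0], start=1):
    belief = update(belief, offer, t, horizon=10)

print("Posterior mean reservation price:", (candidates * belief).sum())
```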
Abstract:
This article describes the development and validation of a multi-dimensional scale for measuring managers’ perceptions of the range of factors that routinely guide their decision-making processes. An instrument for identifying managerial ethical profiles (MEP) is developed by measuring the perceived role of different ethical principles in the decision-making of managers. Evidence for the validity of the multidimensionality of the ethical scale is provided, based on a comparative assessment of different models of managerial ethical decision-making. Confirmatory Factor Analysis (CFA) supported an eight-factor model comprising two factors for each of the four main schools of moral philosophy. Future research needs and the value of this measure to business ethics are discussed.
Abstract:
This paper reports the application of multi-criteria decision-making techniques, PROMETHEE and GAIA, and receptor models, PCA/APCS and PMF, to data from an air monitoring site located on the campus of Queensland University of Technology in Brisbane, Australia, and operated by the Queensland Environmental Protection Agency (QEPA). The data consisted of the concentrations of 21 chemical species and meteorological data collected between 1995 and 2003. PROMETHEE/GAIA separated the samples into those collected when leaded and unleaded petrol were used to power vehicles in the region. The number and source profiles of the factors obtained from the PCA/APCS and PMF analyses were compared. There are noticeable differences in the outcomes, possibly because of the non-negative constraints imposed on the PMF analysis: while PCA/APCS identified 6 sources, PMF reduced the data to 9 factors. Each factor had a distinctive composition suggesting that motor vehicle emissions, controlled burning of forests, secondary sulphate, sea salt and road dust/soil were the most important sources of fine particulate matter at the site. The most plausible locations of the sources were identified by combining the results obtained from the receptor models with meteorological data. The study demonstrated the potential benefits of combining results from multi-criteria decision-making analysis with those from receptor models in order to gain insights that could enhance the development of air pollution control measures.
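To illustrate the difference the abstract attributes to non-negativity constraints, the sketch below contrasts PCA with a non-negative matrix factorisation (sklearn's NMF standing in for PMF) on a synthetic species-by-sample concentration matrix; the data and the three-source structure are invented placeholders, not the QUT/QEPA measurements.

```python
# Illustrative contrast between an eigenvector method (PCA) and a
# non-negatively constrained factorisation (NMF as a stand-in for PMF):
# rows are sampling days, columns are chemical species concentrations.
# All data below are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA, NMF

rng = np.random.default_rng(0)
profiles = rng.random((3, 21))         # 3 hypothetical source profiles
contributions = rng.random((300, 3))   # daily source strengths
X = contributions @ profiles + 0.01 * rng.random((300, 21))

pca = PCA(n_components=3).fit(X)
print("PCA variance explained:", pca.explained_variance_ratio_.round(3))
print("PCA loadings can be negative:", (pca.components_ < 0).any())

nmf = NMF(n_components=3, init="nndsvda", max_iter=500).fit(X)
print("NMF factors are non-negative:", (nmf.components_ >= 0).all())
```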
Abstract:
The aetiology of secondary lymphoedema appears to be multifactorial, with acquired abnormalities as well as pre-existing conditions being contributory factors. Many characteristics bear inconsistent relationships to lymphoedema risk, and the few that are consistently associated with an increased risk of developing the condition do not alone distinguish the at-risk population. Further, our current prevention and management recommendations are not backed by strong evidence. Consequently, there remains much to be learned about who gets the condition, how it can be prevented and how best to treat it. Nonetheless, it is clear that lymphoedema is associated with adverse side effects that have a profound impact on daily life, and preliminary evidence suggests that early detection may lead to more effective treatment while lack of treatment may lead to progression. These are important reasons why lymphoedema deserves clinical attention. However, several pragmatic issues must be considered when discussing whether a routine objective measure of lymphoedema could be integrated into the standard clinical care of those undergoing treatment for cancers known to be associated with the development of lymphoedema.
Abstract:
Public–private partnerships (PPPs) have been widely used as a method of public infrastructure project delivery, both locally and internationally; however, the adoption of PPPs in social infrastructure procurement has so far been very limited. The objective of this paper is to investigate the potential for implementing the current PPP framework in social and affordable housing projects in South East Queensland. Data were collected from 22 interviewees with rich experience in the industry. The findings of this study show that affordable housing investment is considered by industry practitioners to be a risky business in comparison with other private rental housing investment. The main deterrents to the adoption of PPPs in social infrastructure projects are tenant-related factors, such as the inability to pay rent and the inability to care for the property. The study also suggests the importance of seeking strategic partnerships with community-based organisations that have experience in managing similar tenant profiles. The current PPP guideline is also viewed as inappropriate for affordable housing projects, but the principles of the value-for-money (VFM) framework and of risk allocation in PPPs can still be applied to such projects. This study helps in understanding the viability of PPPs in social housing procurement and points out the importance of developing guidelines for multi-stakeholder partnerships and of expanding the current VFM and PPP guidelines.
Abstract:
Tzeng et al. proposed a new threshold multi-proxy multi-signature scheme with threshold verification. In their scheme, a subset of original signers authenticates a designated proxy group to sign on behalf of the original group. A message m has to be signed by a subset of proxy signers who can represent the proxy group. The proxy signature is then sent to the verifier group, where a subset of verifiers can likewise represent the group to authenticate the proxy signature. Two improved schemes were subsequently proposed to eliminate the security leak of Tzeng et al.’s scheme. In this paper, we point out security leakage in all three schemes and further propose a novel threshold multi-proxy multi-signature scheme with threshold verification.
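For readers unfamiliar with the (t, n) threshold idea these schemes build on, here is a minimal Shamir secret-sharing sketch in Python showing why any t of n participants can act for a group; it is not Tzeng et al.'s scheme, the improved schemes, or the new scheme proposed in the paper.

```python
# Minimal sketch of the (t, n) threshold idea underlying such schemes,
# via Shamir secret sharing over a prime field. Illustrates why any t of
# n participants (proxies or verifiers) suffice to act for the group.
import random

P = 2**127 - 1  # a Mersenne prime; field for the polynomial arithmetic

def make_shares(secret, t, n):
    # Degree t-1 polynomial with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=42, t=3, n=5)
assert reconstruct(shares[:3]) == 42   # any 3 of the 5 shares suffice
assert reconstruct(shares[1:4]) == 42
```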
Abstract:
Association rule mining is a technique widely used when querying databases, especially transactional ones, to obtain useful associations or correlations among sets of items. Much work has been done on efficiency, effectiveness and redundancy. There has also been a focus on the quality of rules from single-level datasets, with many interestingness measures proposed. However, with multi-level datasets now common, there is a lack of interestingness measures developed for multi-level and cross-level rules. Single-level measures do not take into account the hierarchy found in a multi-level dataset. This leaves the support-confidence approach, which does not consider the hierarchy anyway and has other drawbacks, as one of the few measures available. In this paper we propose two approaches for measuring multi-level association rules to help evaluate their interestingness. These measures of diversity and peculiarity can be used to help identify those rules from multi-level datasets that are potentially useful.
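As a toy illustration of hierarchy-aware interestingness (the paper's own diversity and peculiarity definitions are not given in the abstract, so the distance-based score below is an assumed stand-in), consider scoring a rule by the average tree distance between its items:

```python
# Hypothetical illustration of scoring a multi-level rule by how spread
# out its items are in the concept hierarchy. This toy definition is our
# own, not necessarily the measure proposed in the paper.
from itertools import combinations

# Items encoded as paths in a hierarchy, e.g. food -> milk -> skim-milk.
hierarchy = {
    "skim-milk":   ["food", "milk", "skim-milk"],
    "white-bread": ["food", "bread", "white-bread"],
    "milk":        ["food", "milk"],
}

def tree_distance(a, b):
    """Path length between two items via their deepest common ancestor."""
    pa, pb = hierarchy[a], hierarchy[b]
    common = 0
    for x, y in zip(pa, pb):
        if x != y:
            break
        common += 1
    return (len(pa) - common) + (len(pb) - common)

def diversity(rule_items):
    pairs = list(combinations(rule_items, 2))
    return sum(tree_distance(a, b) for a, b in pairs) / len(pairs)

# A rule spanning distant branches scores higher than one within a branch.
print(diversity(["skim-milk", "white-bread"]))  # 4.0
print(diversity(["skim-milk", "milk"]))         # 1.0
```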
Abstract:
Association rule mining has made many advances in the area of knowledge discovery. However, the quality of the discovered association rules is a big concern that has drawn increasing attention recently. One problem is the huge size of the extracted rule set: often a huge number of rules can be extracted from a dataset, but many of them are redundant with respect to other rules and thus useless in practice. Mining non-redundant rules is a promising approach to this problem. In this paper, we first propose a definition of redundancy; we then propose a concise representation, called the Reliable basis, for representing non-redundant association rules, for both exact and approximate rules. An important contribution of this paper is the proposal to use the certainty factor as the criterion for measuring the strength of the discovered association rules. With this criterion, we can determine the boundary between redundancy and non-redundancy, ensuring that as many redundant rules as possible are eliminated without reducing the inference capacity of, or the belief in, the remaining extracted non-redundant rules. We prove that redundancy elimination based on the proposed Reliable basis does not reduce the belief in the extracted rules. We also prove that all association rules can be deduced from the Reliable basis; the Reliable basis is therefore a lossless representation of association rules. Experimental results show that the proposed Reliable basis can significantly reduce the number of extracted rules.
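A standard formulation of the certainty factor, the measure the abstract names as the strength criterion, can be computed as sketched below; the exact thresholds and its usage in the Reliable basis construction may differ from this illustrative version.

```python
# Sketch of the certainty factor as a rule-strength measure, using the
# standard definition: how much the antecedent shifts belief in the
# consequent relative to its baseline support.
def support(itemset, transactions):
    return sum(itemset <= t for t in transactions) / len(transactions)

def certainty_factor(antecedent, consequent, transactions):
    conf = (support(antecedent | consequent, transactions)
            / support(antecedent, transactions))
    supp_c = support(consequent, transactions)
    if conf > supp_c:
        return (conf - supp_c) / (1 - supp_c)  # positive certainty added
    if conf < supp_c:
        return (conf - supp_c) / supp_c        # negative certainty added
    return 0.0

transactions = [frozenset(t) for t in
                [{"a", "b"}, {"a", "b", "c"}, {"a", "c"},
                 {"b", "c"}, {"a", "b"}]]
print(certainty_factor(frozenset({"a"}), frozenset({"b"}), transactions))
```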
Abstract:
Economists rely heavily on self-reported measures to examine the relationship between income and health. We directly compare survey responses for a self-reported measure of health that is commonly used in nationally representative surveys with objective measures of the same health condition, focusing on hypertension. We find no evidence of an income/health gradient using self-reported hypertension, but a sizeable gradient when using objectively measured hypertension. We also find that the probability of false-negative reporting is significantly income-graded. Our results suggest that using commonly available self-reported chronic health measures might underestimate true income-related inequalities in health.
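The reporting-error mechanism the abstract describes can be illustrated with simulated data: when false negatives rise with income, a logistic regression on the self-report attenuates the gradient that the objective measure recovers. Everything below (sample size, coefficients, functional forms) is invented for illustration, not the survey data used in the paper.

```python
# Simulated comparison of an income gradient in hypertension measured
# objectively versus via self-report with income-graded false negatives.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
log_income = rng.normal(10.0, 0.7, n)

# True hypertension falls with income (a real gradient).
p_true = 1 / (1 + np.exp(-(2.0 - 0.25 * log_income)))
hyper_true = rng.binomial(1, p_true)

# Under-reporting probability rises with income.
p_false_neg = 1 / (1 + np.exp(-(-6.0 + 0.55 * log_income)))
hyper_self = hyper_true * rng.binomial(1, 1 - p_false_neg)

X = sm.add_constant(log_income)
for name, y in [("objective", hyper_true), ("self-reported", hyper_self)]:
    fit = sm.Logit(y, X).fit(disp=False)
    print(name, "income coefficient:", round(fit.params[1], 3))
```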
Abstract:
The study reported here constitutes a full review of the major geological events that have influenced the morphological development of the southeast Queensland region. Most importantly, it provides evidence that the region’s physiography continues to be geologically ‘active’, and although earthquakes are presently few and of low magnitude, many past events and tectonic regimes continue to be strongly influential over drainage, morphology and topography. Southeast Queensland is typified by highland terrain of metasedimentary and igneous rocks that is parallel and close to younger, lowland coastal terrain. The region is currently situated in a passive-margin tectonic setting that is now under compressive stress, although in the past the region was subject to alternating extensional and compressive regimes. As part of the investigation, the effects of many past geological events upon landscape morphology have been assessed at multiple scales using features such as the location and orientation of drainage channels, topography, faults, fractures, scarps, cleavage, volcanic centres and deposits, and recent earthquake activity. A number of hypotheses for local geological evolution are proposed and discussed. This study has also utilised a geographic information system (GIS) approach that successfully amalgamates the various types and scales of datasets used. A new method of stream ordination has been developed and is used to compare the orientation of channels of similar orders with rock fabric, in a topologically controlled approach that other ordering systems are unable to achieve. Stream pattern analysis has been performed, and the results provide evidence that many drainage systems in southeast Queensland are controlled by known geological structures and by past geological events. The results show that drainage at a fine scale is controlled by cleavage, joints and faults, while at a broader scale large river valleys, such as those of the Brisbane River and North Pine River, closely follow the location of faults; these rivers appear to have become entrenched by differential weathering along these planes of weakness. Significantly, stream pattern analysis has also identified some ‘anomalous’ drainage, suggesting that the orientations of these watercourses are geologically controlled, but by unknown causes. To the north of Brisbane, a ‘coastal drainage divide’ has been recognised and is described here. The divide crosses several lithological units of different ages, runs parallel to the coast and, along its entire length, prevents drainage from the highlands flowing directly to the coast. Diversion of low-order streams away from the divide may be evidence that a more recent process is the driving force. Although there is no conclusive evidence for this at present, it is postulated that the divide may have been generated by uplift or doming associated with mid-Cenozoic volcanism or a blind thrust at depth. Also north of Brisbane, on the D’Aguilar Range, an elevated valley (the ‘Kilcoy Gap’) has been identified that may once have drained towards the coast and now displays reversed drainage, which may have resulted from uplift along the coastal drainage divide and of the D’Aguilar blocks. An assessment of the distribution and intensity of recent earthquakes in the region indicates that activity may be associated with ancient faults. However, recent movement on these faults during these events would have been unlikely, given that earthquakes in the region are characteristically of low magnitude.
There is, however, evidence that compressive stress is building and being released periodically, and ancient faults may be a likely place for this stress to be released. The relationship between ancient fault systems and the Tweed Shield Volcano is also discussed, and it is suggested here that the volcanic activity was associated with renewed faulting on the Great Moreton Fault System during the Cenozoic. The geomorphology and drainage patterns of southeast Queensland have been compared with the morphological characteristics expected at passive and other tectonic settings, both in Australia and globally; of note are the comparisons with the East Brazilian Highlands, the Gulf of Mexico and the Blue Ridge Escarpment. In conclusion, the results of the study clearly show that, although the region is described as a passive margin, its complex past geological history and present compressive stress regime produce a more intricate and varied landscape than would be expected along typical passive continental margins. The literature review provides background to the subject and discusses previous work and methods, whilst the findings are presented in three peer-reviewed, published papers. The methods, hypotheses, suggestions and evidence are discussed at length in the final chapter.
Abstract:
We consider a new form of authenticated key exchange which we call multi-factor password-authenticated key exchange, where session establishment depends on successful authentication of multiple short secrets that are complementary in nature, such as a long-term password and a one-time response, allowing the client and server to be mutually assured of each other's identity without directly disclosing private information to the other party. Multi-factor authentication can provide an enhanced level of assurance in higher-security scenarios such as online banking, virtual private network access, and physical access because a multi-factor protocol is designed to remain secure even if all but one of the factors has been compromised. We introduce a security model for multi-factor password-authenticated key exchange protocols, propose an efficient and secure protocol called MFPAK, and provide a security argument to show that our protocol is secure in this model. Our security model is an extension of the Bellare-Pointcheval-Rogaway security model for password-authenticated key exchange and accommodates an arbitrary number of symmetric and asymmetric authentication factors.
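The following toy sketch conveys only the multi-factor intuition, that every factor must be known to arrive at the session key; it is emphatically not the MFPAK protocol (a real PAKE never transmits or directly hashes the raw password like this), and all labels and constants here are made up.

```python
# Toy illustration only: binding multiple authentication factors into one
# key-derivation step so that knowledge of every factor is needed to agree
# on the session key. NOT the MFPAK protocol; illustration only.
import hashlib, hmac, os

def derive_session_key(password: bytes, one_time_response: bytes,
                       session_transcript: bytes) -> bytes:
    # Extract-then-expand, HKDF style: mix both factors into a PRK, then
    # bind the key to this session's transcript (nonces, identities, ...).
    prk = hmac.new(b"extract-salt", password + one_time_response,
                   hashlib.sha256).digest()
    return hmac.new(prk, b"mfpak-demo" + session_transcript,
                    hashlib.sha256).digest()

transcript = os.urandom(32)  # stands in for exchanged protocol messages
k_client = derive_session_key(b"correct horse", b"874201", transcript)
k_server = derive_session_key(b"correct horse", b"874201", transcript)
assert k_client == k_server
# A single wrong factor yields an unrelated key:
assert derive_session_key(b"correct horse", b"000000", transcript) != k_client
```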
Abstract:
This paper studies receiver autonomous integrity monitoring (RAIM) algorithms and the performance benefits of RTK solutions with multiple constellations. The proposed method is termed multi-constellation RAIM (McRAIM). The McRAIM algorithms take advantage of the ambiguity-invariant character to assist fast identification of multiple satellite faults in the context of multiple constellations, and then detect faulty satellites in the follow-up ambiguity search and position estimation processes. The concept of a Virtual Galileo Constellation (VGC) is used to generate useful dual-constellation data sets for performance analysis. Experimental results from a 24-h data set demonstrate that, with GPS and VGC constellations, McRAIM can significantly enhance the probabilities of detection and exclusion of two simultaneous faulty satellites in RTK solutions.
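The classic residual-based RAIM detection test, the standard building block that schemes like McRAIM extend (the ambiguity-invariance machinery itself is not reproduced here), can be sketched as follows; the geometry matrix, noise level and fault size are arbitrary illustrative choices.

```python
# Sketch of the classic least-squares residual RAIM fault-detection test.
import numpy as np
from scipy.stats import chi2

def raim_detect(G, rho, sigma=1.0, p_fa=1e-3):
    """G is the n x 4 geometry matrix, rho the n pseudorange-domain
    measurements; flags a fault when the residual sum of squares
    exceeds a chi-square threshold with n - 4 degrees of freedom."""
    x, *_ = np.linalg.lstsq(G, rho, rcond=None)
    r = rho - G @ x                      # post-fit residuals
    sse = float(r @ r) / sigma**2
    threshold = chi2.ppf(1 - p_fa, df=len(rho) - 4)
    return sse, threshold, sse > threshold

rng = np.random.default_rng(2)
G = np.hstack([rng.normal(size=(9, 3)), np.ones((9, 1))])  # 9 satellites
rho = G @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(0, 1.0, 9)
rho[4] += 30.0                           # inject a fault on one satellite
print(raim_detect(G, rho))               # test statistic exceeds threshold
```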
Abstract:
The multi-output boost (MOB) converter is a novel DC-DC converter which, unlike the regular boost converter, has the ability to share its total output voltage among different series output voltages for a given duty cycle, for low- and high-power applications. In this paper, a discrete voltage control scheme with an inner hysteresis current control loop is proposed to keep the control law simple for the double-output MOB converter; it can be implemented with a combination of analogue and logic ICs or a simple microcontroller to hold the output voltages of the MOB converter at their reference values against variations in load or input voltage. The salient features of the proposed control strategy are its simplicity of implementation and the ease with which it extends to multiple outputs in the MOB converter. Simulation and experimental results are presented to show the validity of the control strategy.
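A conceptual sketch of the inner hysteresis current loop is given below: the switch toggles whenever the inductor current leaves a band around its reference. The component values and the first-order current model are illustrative assumptions, not the double-output MOB converter of the paper.

```python
# Conceptual simulation of a hysteresis current band: the switch turns on
# when the inductor current falls below (i_ref - band/2) and off above
# (i_ref + band/2). First-order boost-inductor model, illustrative values.
L = 1e-3          # inductor, H
Vin, Vout = 12.0, 24.0
i_ref, band = 2.0, 0.2   # reference current (A) and hysteresis band (A)
dt = 1e-7         # simulation step, s

i, switch_on, trace = 0.0, True, []
for _ in range(200_000):
    # Switch on: inductor sees Vin; switch off: it sees Vin - Vout.
    di_dt = Vin / L if switch_on else (Vin - Vout) / L
    i += di_dt * dt
    if i >= i_ref + band / 2:
        switch_on = False
    elif i <= i_ref - band / 2:
        switch_on = True
    trace.append(i)

# The current settles into a ripple confined to the hysteresis band.
print(min(trace[-1000:]), max(trace[-1000:]))
```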