962 results for Maximum independent set
Abstract:
In Service-Oriented Architectures (SOAs), software systems are decomposed into independent units, namely services, that interact with one another through message exchanges. To promote reuse and evolvability, these interactions are explicitly described right from the early phases of the development lifecycle. Up to now, emphasis has been placed on capturing structural aspects of service interactions. Gradually though, the description of behavioral dependencies between service interactions is gaining increasing attention as a means to push forward the SOA vision. This paper deals with the description of these behavioral dependencies during the analysis and design phases. The paper outlines a set of requirements that a language for modeling service interactions at this level should fulfill, and proposes a language whose design is driven by these requirements.
Abstract:
here/there/then/now was a practice-led research project that brought together 10 independent artists in dance, music, theatre and visual/media arts to create a site-specific program within the walls of the Brisbane Powerhouse. The purpose was to explore how best to conceive flexible performance platforms, theatricalise site-specific work and engage new audiences through forms of promenade experience that could provide open choices on how and where to view it. The sold-out season of six performances, which took place 14-19 May 2002, presented three discrete performance installations set in intimate parts of the building, each with its own aesthetic and communicative intention, culminating in a fourth in-theatre installation where memories of the first three coalesced and were reinterrogated. Each site thereby investigated meaning-making via the moving body and its critical relationship with space and objects, in a dramatic re-contextualisation of traditional solo dance forms, now re-articulated through interdisciplinary practices. The benefit of this approach was the creation of a layered and multimodal experience that could be both shared and subsequently critiqued by performers and audience alike.
Abstract:
Most departmental computing infrastructure reflects the state of networking technology and available funds at the time of construction, which converge in a preconceived notion of homogeneity of network architecture and usage patterns. The DMAN (Digital Media Access Network) project, a large-scale server and network foundation for the Hong Kong Polytechnic University's School of Design, was created as a platform that would support a highly complex academic environment while giving maximum freedom to students, faculty and researchers through simplicity and ease of use. As a centralized multi-user computation backbone, DMAN faces an extremely heterogeneous user and application profile, exceeding the implementation and maintenance challenges of typical enterprise, and even most academic, server set-ups. This paper summarizes the specification, implementation and application of the system while describing its significance for design education in a computational context.
Abstract:
The rising problems associated with construction, such as decreasing quality and productivity, labour shortages, occupational safety, and inferior working conditions, have opened the possibility of more revolutionary solutions within the industry. One prospective option is the implementation of innovative technologies such as automation and robotics, which have the potential to improve the industry in terms of productivity, safety and quality. The construction work site could, theoretically, become a safer environment, with more efficient execution of the work, greater consistency of the outcome and a higher level of control over the production process. By identifying the barriers to construction automation and robotics implementation in construction, and investigating ways in which to overcome them, contributions could be made towards better understanding and facilitating, where relevant, greater use of these technologies in the construction industry so as to promote its efficiency. This research aims to ascertain and explain the barriers to construction automation and robotics implementation by exploring and establishing the relationship between characteristics of the construction industry and attributes of existing construction automation and robotics technologies, and their level of usage and implementation in three selected countries: Japan, Australia and Malaysia. These three countries were chosen as their construction industry characteristics provide contrast in terms of culture, gross domestic product, technology application, organisational structure and labour policies.
This research uses a mixed-method approach of gathering data, both quantitative and qualitative, by employing a questionnaire survey and an interview schedule; using a wide range of samples from management through to on-site users, working in companies ranging from small (less than AUD 0.2 million) to large (more than AUD 500 million), and involved in a broad range of business types and construction sectors. Detailed quantitative (statistical) and qualitative (content) data analysis is performed to provide a set of descriptions, relationships, and differences. The statistical tests selected include cross-tabulations and bivariate and multivariate analysis for investigating possible relationships between variables, and the Kruskal-Wallis and Mann-Whitney U tests of independent samples for hypothesis testing and for generalising from the research sample to the construction industry population. Findings and conclusions arising from the research work, which include the ranking schemes produced for four key areas (the effect of construction attributes on level of usage; barrier variables; differing levels of usage between countries; and future trends), have established a number of potential areas that could impact the level of implementation, both globally and for individual countries.
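As a rough illustration of the rank-based tests mentioned above, the Mann-Whitney U statistic for two independent samples can be computed directly from cross-sample comparisons. This is a minimal sketch, not the thesis's analysis; the country samples are invented ordinal "level of usage" scores.

```python
from itertools import product

def mann_whitney_u(sample_a, sample_b):
    """Mann-Whitney U statistic for two independent samples.

    Counts, over all cross-sample pairs, how often a value from
    sample_a exceeds one from sample_b (ties count as 0.5).
    """
    return sum(
        1.0 if a > b else 0.5 if a == b else 0.0
        for a, b in product(sample_a, sample_b)
    )

# Hypothetical usage-level scores for two countries (illustrative only).
japan = [4, 5, 5, 3, 4]
malaysia = [2, 3, 1, 2, 3]
u = mann_whitney_u(japan, malaysia)
# U ranges from 0 to len(a)*len(b); values near either extreme
# suggest the two samples differ in location.
```

In practice a statistics package would also supply the p-value; the statistic itself is just this pairwise count.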
Abstract:
Carrier frequency offset (CFO) and I/Q mismatch can cause significant performance degradation in OFDM systems. Their estimation and compensation are generally difficult because they are entangled in the received signal. In this paper, we propose some low-complexity estimation and compensation schemes for the receiver which are robust to various CFO and I/Q mismatch values, although performance is slightly degraded for very small CFO. These schemes consist of three steps: forming a cosine estimator free of I/Q mismatch interference, estimating the I/Q mismatch using the estimated cosine value, and forming a sine estimator using samples after I/Q mismatch compensation. These estimators are based on the observation that an estimate of the cosine serves much better as the basis for I/Q mismatch estimation than an estimate of the CFO derived from the cosine function. Simulation results show that the proposed schemes can improve system performance significantly, and that they are robust to CFO and I/Q mismatch.
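To make the compensation step concrete, the sketch below uses the widely used gain/phase imbalance model z = mu·x + nu·conj(x) (the conj(x) image term is what entangles I/Q mismatch with other impairments) and inverts it when the imbalance parameters are known. This is only the algebra under an assumed model, not the paper's estimators, which recover the parameters blindly.

```python
import cmath

def apply_iq_mismatch(x, gain, phase):
    """Distort complex baseband sample x with gain/phase imbalance.

    Common model: z = mu*x + nu*conj(x), with mu, nu derived from
    the gain and phase errors of the receiver's I and Q branches.
    """
    mu = (1 + gain * cmath.exp(-1j * phase)) / 2
    nu = (1 - gain * cmath.exp(1j * phase)) / 2
    return mu * x + nu * x.conjugate()

def compensate_iq_mismatch(z, gain, phase):
    """Invert the mismatch exactly when (gain, phase) are known.

    In the paper's schemes these parameters are *estimated* (via the
    cosine estimator); here they are given, to show only the algebra:
    conj(mu)*z - nu*conj(z) = (|mu|^2 - |nu|^2) * x.
    """
    mu = (1 + gain * cmath.exp(-1j * phase)) / 2
    nu = (1 - gain * cmath.exp(1j * phase)) / 2
    return (mu.conjugate() * z - nu * z.conjugate()) / (abs(mu) ** 2 - abs(nu) ** 2)

x = 0.7 - 0.3j                        # clean baseband sample
z = apply_iq_mismatch(x, 1.1, 0.05)   # 10% gain error, ~3 degree phase error
x_hat = compensate_iq_mismatch(z, 1.1, 0.05)
```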
Abstract:
Some Engineering Faculties are turning to the problem-based learning (PBL) paradigm to engender necessary skills and competence in their graduates. Since, at the same time, some Faculties are moving towards distance education, questions are being asked about the effectiveness of PBL for technical fields such as Engineering when delivered in virtual space. This paper outlines an investigation of how student attributes affect the learning experience in PBL courses offered in virtual space. A frequency distribution was superimposed on the outcome space of a phenomenographic study of a suitable PBL course to investigate the effect of different student attributes on the learning experience. It was discovered that the quality, quantity, and style of facilitator interaction had the greatest impact on the student learning experience. This highlights the need to establish consistent student interaction plans and to set, and ensure compliance with, minimum standards for facilitation and student interactions.
Abstract:
President’s Message Hello fellow AITPM members, Well I can’t believe it’s already October! My office is already organising its end-of-year function and looking to plan for 2010. Our whole School is moving to a different building next year, with the lovely L block eventually making way for a new shiny one. Those of you who have entered the Brisbane CBD from the south side, across the Captain Cook Bridge, would know L block as the big nine-storey brick and concrete Lego-block ode to 1970s functional architecture, which greets you on the right-hand side. Onto traffic matters: an issue that has been tossing around in my mind of late is that of speed. I know I am growing older and may be prematurely becoming a “grumpy old man”, but everyone around me locally seems to be accelerating off from the stop line much faster than I was taught to for economical driving, both here and in the United States (yes, they made my wife and me resit our written and practical driving tests when we lived there). People here in Australia also seem to be driving right on top of the posted speed limit, on whichever part of the Road Hierarchy, whether urban or rural. I was also taught, on both sides of the planet, that the posted speed limit is a maximum legal speed, not the recommended driving speed. This message did seem to sink in with the American drivers around me when we lived in Oregon, where people did appear to drive more cautiously. Further, posted speed limits in Oregon were, and I presume still are, set more conservatively, by about 5 mph or 10 km/h, than Australian limits for any given part of the Road Hierarchy. Another excellent speed limit treatment used in Oregon was in school zones, where reduced speed limits applied “when children are present” rather than during prescribed hours on school days. This would be especially useful here in Australia, where a lot of extra-curricular activities take place around schools outside of the prescribed speed limit hours.
Before- and after-hours school care is on the increase (with parents dropping off and collecting children near dawn and dusk in the winter), and many child-centred land uses are located adjacent to schools, such as Scouts/Guides halls, swimming pools and parks. Consequently, I believe there needs to be some consideration towards more public campaigning about economical driving and the real purpose of the speed limit, or perhaps even a rethink of the speed limit concept, if people really are driving on top of it and it’s not just me becoming grumpier (our industrial psychology friends at the research centres may be able to assist us here). The Queensland organising committee is now in full swing organising the 2010 AITPM National Conference, What’s New?, so please keep a lookout for related content. Best regards to all, Jon Bunker. PS: A cartoonist’s view of traffic engineers, which I thought you might enjoy: http://xkcd.com/277/
Abstract:
Chromatographic fingerprints of 46 Eucommia Bark samples were obtained by liquid chromatography with diode array detection (LC-DAD). These samples were collected from eight provinces in China with different geographical locations and climates. Seven common LC peaks that could be used for fingerprinting this popular traditional Chinese medicine were found, and six were identified by LC-MS as substituted resinols (four compounds), geniposidic acid and chlorogenic acid. Principal components analysis (PCA) indicated that samples from Sichuan, Hubei, Shanxi and Anhui (the SHSA provinces) clustered together. The objects from the other four provinces, Guizhou, Jiangxi, Gansu and Henan, were discriminated and widely scattered on the biplot in four province clusters. The SHSA provinces are geographically close together while the others are spread out. These results suggested that the composition of the Eucommia Bark samples depends on their geographic location and environment. In general, the basis for discrimination on the PCA biplot from the original 46 objects × 7 variables data matrix was the same as that for the SHSA subset (36 × 7 matrix). The seven marker-compound loading vectors grouped into three sets: (1) three closely correlating substituted resinol compounds and chlorogenic acid; (2) the fourth resinol compound, identified by the OCH3 substituent in the R4 position, and an unknown compound; and (3) geniposidic acid, which was independent of the set 1 variables and negatively correlated with the set 2 ones. These observations from the PCA biplot were supported by hierarchical cluster analysis (HCA), and indicated that Eucommia Bark preparations may be successfully compared using the HPLC responses of the seven marker compounds and chemometric methods such as PCA and the complementary HCA.
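The PCA biplot described above overlays object scores and variable loadings from one decomposition. A minimal sketch of that computation via the singular value decomposition follows; the random matrix merely stands in for the real 46 samples × 7 marker-compound peak-area data, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the 46 objects x 7 variables peak-area matrix.
X = rng.normal(size=(46, 7))

# Column-centre, then take the SVD. Scores place the 46 objects on
# the principal components; loadings give the 7 variable vectors
# that a biplot draws over the same axes.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                     # object coordinates (46 x 7)
loadings = Vt.T                    # variable vectors (7 x 7)
explained = s**2 / np.sum(s**2)    # proportion of variance per PC
```

Clustering of provinces would then be read off the first two score columns, with correlated marker compounds appearing as loading vectors pointing in similar directions.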
Abstract:
Examined whether discrete working memory deficits underlie positive, negative and disorganised symptoms of schizophrenia. 52 outpatients (mean age 37.5 yrs) with schizophrenia were studied using items drawn from the Positive and Negative Syndrome Scale (PANSS). Linear regression and correlational analyses were conducted to examine whether symptom dimension scores were related to performance on several tests of working memory function. Severity of negative symptoms correlated with reduced production of words during a verbal fluency task, impaired ability to hold letter and number sequences on-line and manipulate them simultaneously, reduced performance during a dual task, and compromised visuospatial working memory under distraction-free conditions. Severity of disorganisation symptoms correlated with impaired visuospatial working memory under conditions of distraction, failure of inhibition during a verbal fluency task, perseverative responding on a test of set-shifting ability, and impaired ability to judge the veracity of simple declarative statements. The present study provides evidence that the positive, negative and disorganised symptom dimensions of the PANSS constitute independent clusters, associated with unique patterns of working memory impairment.
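The correlational analyses reported above pair a symptom-dimension score with a working-memory task score across patients. A minimal sketch of that computation is below; the data are invented for illustration and are not the study's, merely showing the expected negative association between negative-symptom severity and verbal fluency output.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (invented) data: higher negative-symptom severity paired
# with fewer words produced on the verbal fluency task.
negative_severity = [1, 2, 3, 4, 5, 6]
fluency_words = [24, 22, 19, 17, 15, 12]
r = pearson_r(negative_severity, fluency_words)   # strongly negative
```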
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined based on the users’ needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches must deal with low-frequency pattern issues. The measures used by the data mining techniques (for example, “support” and “confidence”) to learn the profile have turned out to be unsuitable for filtering: they can lead to a mismatch problem. This thesis uses rough set-based (term-based) reasoning and pattern mining as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of threshold setting have been developed using rough set decision theory.
The second stage (pattern mining) aims at solving the problem of information mismatch, and is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy; the most likely relevant documents are assigned higher scores by the ranking function. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results, and the overall performance of the system is improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, such as the traditional Rocchio IF model, the state-of-the-art term-based models (including BM25 and Support Vector Machines (SVM)), and the Pattern Taxonomy Model (PTM).
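The two-stage pipeline can be sketched as follows. This is a deliberately simplified toy: the term-overlap score stands in for the thesis's rough-set topic filter, and word-pair containment stands in for pattern-taxonomy scoring; the threshold, documents and patterns are all invented for illustration.

```python
def topic_filter(docs, profile, threshold):
    """Stage 1: drop documents whose term overlap with the user
    profile falls below a threshold (overload reduction)."""
    def score(doc):
        words = set(doc.lower().split())
        return len(words & profile) / max(len(profile), 1)
    return [d for d in docs if score(d) >= threshold]

def pattern_rank(docs, patterns):
    """Stage 2: rank the survivors by how many discovered patterns
    (here, simple word sets) each document contains."""
    def score(doc):
        words = set(doc.lower().split())
        return sum(1 for p in patterns if p <= words)
    return sorted(docs, key=score, reverse=True)

profile = {"data", "mining", "filtering"}
patterns = [{"data", "mining"}, {"pattern", "taxonomy"}]
docs = [
    "data mining for pattern taxonomy research",
    "filtering spam with data mining",
    "cooking recipes and garden tips",
]
survivors = topic_filter(docs, profile, threshold=0.5)   # drops the last doc
ranked = pattern_rank(survivors, patterns)               # most patterns first
```

The point of the split survives even in the toy: the cheap stage-1 test shrinks the stream before the more expensive pattern scoring runs.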
Abstract:
The Node-based Local Mesh Generation (NLMG) algorithm, which is free of mesh inconsistency, is one of the core algorithms in the Node-based Local Finite Element Method (NLFEM); it achieves a seamless link between mesh generation and stiffness matrix calculation, which helps to improve the parallel efficiency of FEM. The key to ensuring the efficiency and reliability of NLMG is determining the candidate satellite-node set of a central node quickly and accurately. This paper develops a Fast Local Search Method based on Uniform Bucket (FLSMUB) and a Fast Local Search Method based on Multilayer Bucket (FLSMMB), and applies them successfully to this decisive problem, i.e., finding the candidate satellite-node set of any central node in the NLMG algorithm. Using FLSMUB or FLSMMB, the NLMG algorithm becomes a practical tool to reduce the parallel computation cost of FEM. Parallel numerical experiments validate that both FLSMUB and FLSMMB are fast, reliable and efficient for their suitable problems, and that they are especially effective for large-scale parallel problems.
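The uniform-bucket idea behind FLSMUB can be sketched generically: hash nodes into a grid of square cells, then scan only the cells the search radius overlaps instead of all nodes. The details of FLSMUB itself are not reproduced here; this is a standard uniform-grid neighbour search under assumed 2-D coordinates.

```python
from collections import defaultdict
from math import floor, hypot

def build_buckets(points, cell):
    """Hash 2-D nodes into a uniform grid of square buckets."""
    buckets = defaultdict(list)
    for i, (x, y) in enumerate(points):
        buckets[(floor(x / cell), floor(y / cell))].append(i)
    return buckets

def candidates_near(points, buckets, cell, centre, radius):
    """Collect candidate satellite nodes for a central node by
    scanning only the buckets the search radius can overlap."""
    cx, cy = centre
    span = int(radius // cell) + 1
    bx, by = floor(cx / cell), floor(cy / cell)
    found = []
    for dx in range(-span, span + 1):
        for dy in range(-span, span + 1):
            for i in buckets.get((bx + dx, by + dy), ()):
                x, y = points[i]
                if hypot(x - cx, y - cy) <= radius:
                    found.append(i)
    return found

points = [(0.1, 0.1), (0.2, 0.3), (0.9, 0.9), (5.0, 5.0)]
buckets = build_buckets(points, cell=1.0)
near = candidates_near(points, buckets, 1.0, centre=(0.0, 0.0), radius=0.5)
```

With roughly uniform node density, each query touches a constant number of buckets, which is what makes the candidate-set step cheap enough for large parallel meshes.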
Abstract:
A data-driven background dataset refinement technique was recently proposed for SVM based speaker verification. This method selects a refined SVM background dataset from a set of candidate impostor examples after individually ranking examples by their relevance. This paper extends this technique to the refinement of the T-norm dataset for SVM-based speaker verification. The independent refinement of the background and T-norm datasets provides a means of investigating the sensitivity of SVM-based speaker verification performance to the selection of each of these datasets. Using refined datasets provided improvements of 13% in min. DCF and 9% in EER over the full set of impostor examples on the 2006 SRE corpus with the majority of these gains due to refinement of the T-norm dataset. Similar trends were observed for the unseen data of the NIST 2008 SRE.
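The refinement step described above (rank candidate impostor examples by relevance, keep the best) can be sketched as follows. The mean-cosine relevance score and all vectors here are illustrative assumptions, not the paper's metric or data.

```python
def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

def refine_dataset(candidates, dev_set, keep):
    """Rank each candidate impostor example by its mean similarity
    to a development set, then keep the `keep` highest-ranked ones."""
    scored = sorted(
        candidates,
        key=lambda c: sum(cosine(c, d) for d in dev_set) / len(dev_set),
        reverse=True,
    )
    return scored[:keep]

dev_set = [(1.0, 0.0), (0.9, 0.1)]
candidates = [(0.8, 0.2), (0.0, 1.0), (1.0, 0.1), (-1.0, 0.0)]
refined = refine_dataset(candidates, dev_set, keep=2)
```

Applying the same refinement independently to the background and T-norm pools, as the paper does, is then just two calls with different candidate sets.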
Abstract:
This thesis details a methodology to estimate urban stormwater quality based on a set of easy-to-measure physico-chemical parameters. These parameters can be used as surrogate parameters to estimate other key water quality parameters. The key pollutants considered in this study are nitrogen compounds, phosphorus compounds and solids. The use of surrogate parameter relationships to evaluate urban stormwater quality will reduce the cost of monitoring, so that scientists will have the added capability to generate a large amount of data for more rigorous analysis of key urban stormwater quality processes, namely pollutant build-up and wash-off. This in turn will assist in the development of more stringent stormwater quality mitigation strategies. The research methodology was based on a series of field investigations, laboratory testing and data analysis. Field investigations were conducted to collect pollutant build-up and wash-off samples from residential roads and roof surfaces. Past research has identified that these impervious surfaces are the primary pollutant sources to urban stormwater runoff. A specially designed vacuum system and rainfall simulator were used in the collection of pollutant build-up and wash-off samples. The collected samples were tested for a range of physico-chemical parameters. Data analysis was conducted using both univariate and multivariate data analysis techniques. Analysis of build-up samples showed that pollutant loads accumulated on road surfaces are higher than the pollutant loads on roof surfaces. Furthermore, it was found that the fraction of solids smaller than 150 µm is the most polluted particle size fraction in solids build-up on both road and roof surfaces. The analysis of wash-off data confirmed that the simulated wash-off process adopted for this research agrees well with the general understanding of the wash-off process on urban impervious surfaces.
The observed pollutant concentrations in wash-off from road surfaces were different to pollutant concentrations in wash-off from roof surfaces. Therefore, firstly, the identification of surrogate parameters was undertaken separately for road and roof surfaces. Secondly, a common set of surrogate parameter relationships was identified for both surfaces together to evaluate urban stormwater quality. Surrogate parameters were identified for nitrogen, phosphorus and solids separately. Electrical conductivity (EC), total organic carbon (TOC), dissolved organic carbon (DOC), total suspended solids (TSS), total dissolved solids (TDS), total solids (TS) and turbidity (TTU) were selected as the relatively easy-to-measure parameters. Consequently, surrogate parameters for nitrogen and phosphorus were identified from the set of easy-to-measure parameters for both road surfaces and roof surfaces. Additionally, surrogate parameters for TSS, TDS and TS, which are key indicators of solids, were obtained from EC and TTU, which can be direct field measurements. The regression relationships developed between surrogate parameters and each key parameter of interest took a similar form for road and roof surfaces, namely simple linear regression equations. The identified relationships for road surfaces were DTN-TDS:DOC, TP-TS:TOC, TSS-TTU, TDS-EC and TS-TTU:EC. The identified relationships for roof surfaces were DTN-TDS and TS-TTU:EC. Some of the relationships developed had a higher confidence interval whilst others had a relatively low confidence interval. The relationships obtained for DTN-TDS, DTN-DOC, TP-TS and TS-EC for road surfaces demonstrated good near-site portability potential. Currently, best management practices are focussed on providing treatment measures for stormwater runoff at catchment outlets, where separation of road and roof runoff is not found.
In this context, it is important to find a common set of surrogate parameter relationships for road surfaces and roof surfaces to evaluate urban stormwater quality. Consequently, the DTN-TDS, TS-EC and TS-TTU relationships were identified as the common relationships capable of providing measurements of DTN and TS irrespective of the surface type.
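A surrogate relationship of the simple linear form used above (for example, estimating TS from an EC field reading) amounts to a least-squares line fit. The paired measurements below are invented for illustration; the thesis's actual coefficients are not reproduced.

```python
def fit_line(xs, ys):
    """Least-squares fit ys ~ a*xs + b, the form used for the
    surrogate-parameter relationships (e.g. TS from EC)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return a, my - a * mx

# Illustrative (invented) paired field measurements: electrical
# conductivity (uS/cm) against total solids (mg/L).
ec = [100, 150, 200, 250, 300]
ts = [80, 118, 162, 198, 242]
a, b = fit_line(ec, ts)
ts_predicted = a * 175 + b   # estimate TS from an EC reading alone
```

Once such a line is calibrated per surface type (or jointly, for the common relationships), the cheap field measurement substitutes for the laboratory analysis.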