977 results for Appropriate Selection Processes Are Available For Choosing Hospitality Texts


Relevance:

100.00%

Publisher:

Abstract:

Research in the area of geopolymers has gained momentum over the past 20 years. Studies confirm that geopolymer concrete has good compressive strength, tensile strength, flexural strength, modulus of elasticity and durability, and that these properties are comparable with those of OPC concrete. There are many occasions where concrete is exposed to elevated temperatures, such as fire, exposure to thermal processors and furnaces, and nuclear exposure. In such cases, an understanding of the behaviour of concrete and structural members exposed to elevated temperatures is vital. Even though many research reports are available on the behaviour of OPC concrete at elevated temperatures, there is limited information on the behaviour of geopolymer concrete after exposure to elevated temperatures. A preliminary study was carried out for the selection of a mix proportion. The important variables considered in the present study include alkali/fly ash ratio, percentage of total aggregate content, fine aggregate to total aggregate ratio, molarity of sodium hydroxide, sodium silicate to sodium hydroxide ratio, curing temperature and curing period. The influence of these variables on the engineering properties of geopolymer concrete was investigated. A study on the interface shear strength of reinforced and unreinforced geopolymer concrete, as well as OPC concrete, was also carried out. Engineering properties of fly ash based geopolymer concrete after exposure to elevated temperatures (ambient to 800 °C) were studied and the results were compared with those of conventional concrete. Scanning Electron Microscope analysis, Fourier Transform Infrared analysis, X-ray powder Diffractometer analysis and Thermogravimetric analysis of geopolymer mortar or paste at ambient temperature and after exposure to elevated temperature were also carried out in the present research work. An experimental study was conducted on geopolymer concrete beams after exposure to elevated temperatures (ambient to 800 °C). Load-deflection characteristics, ductility and moment-curvature behaviour of the geopolymer concrete beams after exposure to elevated temperatures were investigated. Based on the present study, the major conclusions can be summarized as follows. There is a definite proportion of the various ingredients that achieves maximum strength properties. Geopolymer concrete with a total aggregate content of 70% by volume, a fine aggregate to total aggregate ratio of 0.35, NaOH molarity of 10, Na2SiO3/NaOH ratio of 2.5 and alkali to fly ash ratio of 0.55 gave the maximum compressive strength in the present study. Early strength development in geopolymer concrete can be achieved by proper selection of the curing temperature and the curing period. With 24 hours of curing at 100 °C, 96.4% of the 28th-day cube compressive strength could be achieved in 7 days in the present study. The interface shear strength of geopolymer concrete is lower than that of OPC concrete: a reduction of 33% and 29% was observed for unreinforced and reinforced geopolymer specimens respectively. The interface shear strength of geopolymer concrete can be approximately estimated as 50% of the value obtained from the available equations for the interface shear strength of ordinary Portland cement concrete (the methods of Mattock and of ACI).

Fly ash based geopolymer concrete undergoes a higher rate of strength loss (compressive strength, tensile strength and modulus of elasticity) during its early heating period (up to 200 °C) than OPC concrete. At exposure temperatures beyond 600 °C, the unreacted crystalline materials in geopolymer concrete transform into an amorphous state and undergo polymerization, so there is no further loss of strength (compressive strength, tensile strength and modulus of elasticity) in geopolymer concrete, whereas OPC concrete continues to lose its strength properties at a faster rate beyond an exposure temperature of 600 °C. At present no equation is available to predict the strength properties of geopolymer concrete after exposure to elevated temperatures. Based on the study carried out, new equations have been proposed to predict the residual strengths (cube compressive strength, split tensile strength and modulus of elasticity) of geopolymer concrete after exposure to elevated temperatures (up to 800 °C). These equations could be used for material modelling until more refined equations become available. Compared to OPC concrete, geopolymer concrete shows better resistance to surface cracking when exposed to elevated temperatures. In the present study, while OPC concrete started developing cracks at 400 °C, geopolymer concrete did not show any visible cracks up to 600 °C and developed only minor cracks at an exposure temperature of 800 °C. Geopolymer concrete beams develop cracks at early load stages if they have been exposed to elevated temperatures. Even though the material strength of geopolymer concrete does not decrease beyond 600 °C, the flexural strength of the corresponding beams reduces rapidly after exposure to 600 °C, primarily due to the rapid loss of strength of the steel. With increasing temperature, the curvature at the yield point of a geopolymer concrete beam increases and its ductility therefore reduces; in the present study, compared to the ductility at ambient temperature, the ductility of geopolymer concrete beams reduced by 63.8% after exposure to 800 °C. Appropriate equations have been proposed to predict the service-load crack width of geopolymer concrete beams exposed to elevated temperatures. These equations could be used to limit the service load on geopolymer concrete beams exposed to elevated temperatures (up to 800 °C) for a predefined crack width (between 0.1 mm and 0.3 mm), or vice versa. The moment-curvature relationship of geopolymer concrete beams at ambient temperature is similar to that of RCC beams and can be predicted using a strain compatibility approach. Once exposed to an elevated temperature, however, the strain compatibility approach underestimates the curvature of geopolymer concrete beams between the first cracking and yielding points.
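The proposed residual-strength and crack-width equations are not reproduced in this abstract. As a minimal illustration of the reported interface-shear result only, the sketch below estimates geopolymer interface shear capacity as 50% of an OPC value computed with the standard ACI 318-style shear-friction expression V_n = mu * A_vf * f_y; the friction coefficient and the example reinforcement numbers are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch (not the thesis's equations): geopolymer interface shear
# taken as ~50% of the ACI 318 shear-friction value V_n = mu * A_vf * f_y.
# mu = 1.4 (monolithic, normal-weight concrete) and the example numbers are
# illustrative assumptions; upper-bound caps on V_n are omitted for brevity.

def aci_shear_friction(A_vf_mm2: float, f_y_mpa: float, mu: float = 1.4) -> float:
    """Nominal interface shear strength V_n in newtons (mm^2 * MPa = N)."""
    return mu * A_vf_mm2 * f_y_mpa

def geopolymer_interface_shear(A_vf_mm2: float, f_y_mpa: float) -> float:
    """~50% of the OPC shear-friction value, per the reduction reported here."""
    return 0.5 * aci_shear_friction(A_vf_mm2, f_y_mpa)

if __name__ == "__main__":
    V_opc = aci_shear_friction(A_vf_mm2=400.0, f_y_mpa=415.0)
    V_geo = geopolymer_interface_shear(400.0, 415.0)
    print(f"OPC estimate:        {V_opc / 1e3:.1f} kN")
    print(f"Geopolymer estimate: {V_geo / 1e3:.1f} kN")
```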

Relevance:

100.00%

Publisher:

Abstract:

This dissertation deals with the introduction of complex software systems which, consisting of a combination of parameterised standard software paired with custom software components that secure competitive advantage, no longer constitute software engineering projects in the classical sense, but instead require a strategy-oriented design of business processes and their implementation in software systems. Of particular importance here is the problem of adequately balancing a TCO-optimising introduction against full support for the company's critical success factors. The use of integrated standard business software, which offers the possibility of lowering TCO but also carries the risk of losing unique selling points in the market through tendencies towards standardisation, is an essential problem to be solved in introduction projects in order to avoid suboptimal outcomes. The use of process models that are often oriented towards classical software development projects, or that amount to simplified phase models for project management, leads to a lack of situational adequacy in the detailed situations of the subprojects of a complex introduction project. The generic process model developed in this work for the strategy-oriented and participatory introduction of complex software systems in the business application domain turns a software introduction project into a strategic element of the company that strengthens its competitive position, owing to the specially elaborated approaches to the strategy-oriented introduction and development of such systems and to the situationally adequate process strategies within the subproject organisation. The considerations discussed in the dissertation favour an approach that implies a close fusion of the organisational optimisation project with the software implementation process. An essential result of the work is a prioritisation of business processes with the aim of, on the one hand, avoiding an organisational suboptimum for processes with high competitive priority and, on the other, nevertheless carrying out the organisational design and system implementation processes quickly and with minimal resources. In addition, excluding further processes from the introduction initially yields a productive system that covers the company's essential needs, but which can be extended in later project steps into a system with comprehensive functionality. This makes it possible to meet the strategic requirements of a modern information system, which must consistently support a company's critical success factors, while at the same time carrying out the project with as few resources as possible by exploiting the cost-reduction potential of a standard solution. A further essential aspect is situationally adequate model instantiation, i.e. the project-specific adaptation of the process model and the situationally adequate choice of procedures in subprojects, thereby exploiting the advantages of the various process strategies in concrete project management. The need to develop a project organisation for prototyping-oriented procedures is also taken into account in this context.

The need for companies, on the one hand, to stand out in the market with strong differentiation potential and, on the other, to pursue cost optimisation in the face of constantly shrinking margins suggests that the model developed here will remain successful in the future. Added to this is the trend towards best-of-breed approaches and component-based systems in software selection, which will make a markedly differentiated approach in projects increasingly necessary. The prototyping approaches integrated into the model accommodate the growing importance of user involvement.

Relevance:

100.00%

Publisher:

Abstract:

During recent years, quantum information processing and the study of N-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing efficient quantum information protocols, such as quantum key distribution, teleportation or quantum computation, however, these investigations have also revealed a number of difficulties which still need to be resolved in practice. Quantum information protocols rely on the application of unitary and non-unitary quantum operations that act on a given set of quantum mechanical two-state systems (qubits) to form (entangled) states in which the information is encoded. The overall system of qubits is often referred to as a quantum register. Today the entanglement in a quantum register is known as the key resource for many protocols of quantum computation and quantum information theory. However, despite the successful demonstration of several protocols, such as teleportation or quantum key distribution, there are still many open questions of how entanglement affects the efficiency of quantum algorithms or how it can be protected against noisy environments. To facilitate the simulation of such N-qubit quantum systems and the analysis of their entanglement properties, we have developed the Feynman program. The program package provides all necessary tools to define and work with quantum registers, quantum gates and quantum operations. Using an interactive and easily extendible design within the framework of the computer algebra system Maple, the Feynman program is a powerful toolbox not only for teaching the basic and more advanced concepts of quantum information but also for studying their physical realization in the future. To this end, the Feynman program implements a selection of algebraic separability criteria for bipartite and multipartite mixed states as well as the most frequently used entanglement measures from the literature. Additionally, the program supports work with quantum operations and their associated (Jamiolkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. As an application of the developed tools, we further present two case studies in which the entanglement in two atomic processes is investigated. In particular, we have studied the change of the electron-ion spin entanglement in atomic photoionization and the photon-photon polarization entanglement in the two-photon decay of hydrogen. The results show that both processes are, in principle, suitable for the creation and control of entanglement. Apart from process-specific parameters like the initial atom polarization, it is mainly the process geometry which offers a simple and effective instrument to adjust the final-state entanglement. Finally, for the case of the two-photon decay of hydrogen-like systems, we study the difference between nonlocal quantum correlations, as given by the violation of the Bell inequality, and the concurrence as a true entanglement measure.
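The Feynman program itself is a Maple package; as a language-neutral illustration of one quantity named above, the following NumPy sketch computes the Wootters concurrence of a two-qubit density matrix. The Bell-state input is an illustrative example.

```python
# Minimal NumPy sketch (not the Maple-based Feynman program) of the Wootters
# concurrence C(rho) = max(0, l1 - l2 - l3 - l4), where l_i are the
# decreasingly ordered square roots of the eigenvalues of
# rho @ (sy x sy) rho* (sy x sy).
import numpy as np

def concurrence(rho: np.ndarray) -> float:
    sy = np.array([[0, -1j], [1j, 0]])
    syy = np.kron(sy, sy)                      # sigma_y tensor sigma_y
    rho_tilde = syy @ rho.conj() @ syy         # spin-flipped state
    eigs = np.linalg.eigvals(rho @ rho_tilde)
    l = np.sort(np.sqrt(np.abs(eigs.real)))[::-1]  # eigenvalues >= 0 up to noise
    return max(0.0, l[0] - l[1] - l[2] - l[3])

# The maximally entangled Bell state |Phi+> should give C = 1.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_bell = np.outer(phi_plus, phi_plus.conj())
print(concurrence(rho_bell))  # ~1.0
```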

Relevance:

100.00%

Publisher:

Abstract:

Web services from different partners can be combined into applications that realize a more complex business goal. Such applications, built as Web service compositions, define how interactions between Web services take place in order to implement the business logic. Web service compositions not only have to provide the desired functionality but also have to comply with certain Quality of Service (QoS) levels. Maximizing users' satisfaction, also reflected as Quality of Experience (QoE), is a primary goal to be achieved in a Service-Oriented Architecture (SOA). Unfortunately, in a dynamic environment like SOA, unforeseen situations may arise, such as services being unavailable or not responding in the desired time frame. In such situations, appropriate actions need to be triggered in order to avoid the violation of QoS and QoE constraints. In this thesis, solutions are developed to manage Web services and Web service compositions with regard to QoS and QoE requirements. The Business Process Rules Language (BPRules) was developed to manage Web service compositions when undesired QoS or QoE values are detected. BPRules provides a rich set of management actions that may be triggered to control the service composition and improve its quality behavior. Regarding the quality properties, BPRules distinguishes between the QoS values as promised by the service providers, the QoE values assigned by end-users, the monitored QoS as measured by our BPR framework, and the predicted QoS and QoE values. BPRules facilitates the specification of user groups characterized by different context properties and allows triggering a personalized, context-aware service selection tailored to the specified user groups. In a service market where a multitude of services with the same functionality but different quality values are available, the right services need to be selected for realizing the service composition. We developed new and efficient heuristic algorithms that choose high-quality services for the composition. BPRules offers the possibility to integrate multiple service selection algorithms. The selection algorithms are also applicable to non-linear objective functions and constraints. The BPR framework includes new approaches for context-aware service selection and quality property prediction. We consider the location of users and services as a context dimension for the prediction of response time and throughput. The BPR framework combines all new features and contributions into a comprehensive management solution. Furthermore, it facilitates flexible monitoring of QoS properties without having to modify the description of the service composition. We show how the different modules of the BPR framework work together in order to execute the management rules, and we evaluate how our selection algorithms outperform a genetic algorithm from related research. The evaluation also shows how context data can be used for a personalized prediction of response time and throughput.
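The thesis's heuristic selection algorithms are not specified in this abstract. The sketch below only illustrates the general shape of QoS-aware service selection with a simple greedy heuristic: pick the fastest affordable candidate per task under a cost budget. All service names, QoS values and the budget are invented for illustration.

```python
# Generic greedy sketch of QoS-aware service selection (not the thesis's
# algorithms): per task, choose the candidate with the best response time
# that keeps total cost within a budget.
from typing import Dict, List, Tuple

# candidates[task] = list of (service_name, response_time_ms, cost)
candidates: Dict[str, List[Tuple[str, float, float]]] = {
    "payment":  [("payA", 120, 5.0), ("payB", 80, 9.0)],
    "shipping": [("shipA", 200, 3.0), ("shipB", 150, 6.0)],
}

def greedy_select(cands, budget: float):
    chosen, spent = {}, 0.0
    for task, options in cands.items():
        # fastest first; fall back to slower, cheaper options if over budget
        for name, rt, cost in sorted(options, key=lambda o: o[1]):
            if spent + cost <= budget:
                chosen[task] = (name, rt, cost)
                spent += cost
                break
        else:
            raise ValueError(f"no affordable candidate for task {task!r}")
    return chosen, spent

print(greedy_select(candidates, budget=12.0))
```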

Relevance:

100.00%

Publisher:

Abstract:

The work developed in this thesis presents an in-depth study and provides innovative solutions in the field of recommender systems. The methods these systems use to generate recommendations, such as Content-Based Filtering (CBF), Collaborative Filtering (CF) and Knowledge-Based Filtering (KBF), require information about users in order to predict their preferences for certain products. This information may be demographic (gender, age, address, etc.), ratings given to products bought in the past, or information about the users' interests. There are two ways to obtain this information: users provide it explicitly, or the system acquires the implicit information available in users' transactions or search histories. For example, the movie recommender system MovieLens (http://movielens.umn.edu/login) asks users to rate at least 15 movies on a scale from * to ***** (awful, ..., must see), and generates recommendations on the basis of these ratings. When users are not registered and the system has no information about them, some systems make recommendations based on browsing history; Amazon.com (http://www.amazon.com) recommends on the basis of the searches a user has made, or recommends the best-selling product. However, such systems suffer from a certain lack of information. This problem is generally solved by acquiring additional information: users are asked about their interests, or the information is sought in additional sources. The solution proposed in this thesis is to look for this information in several sources, specifically those containing implicit information about users' preferences. These sources may be structured, such as databases of purchase information, or unstructured, such as web pages where users leave their opinion about a product they have bought or own. Three fundamental problems must be solved to achieve this goal: 1. identifying sources with information suitable for recommender systems; 2. defining criteria that allow the comparison and selection of the most suitable sources; 3. retrieving information from unstructured sources. To this end, the thesis has developed: 1. a methodology for identifying and selecting the most suitable sources, using criteria based on the characteristics of the sources together with a trust measure; 2. a mechanism for retrieving the unstructured user information available on the web, using text mining techniques and ontologies to extract the information and structure it appropriately for use by recommenders. The contributions of this doctoral thesis are: 1. the definition of a set of characteristics for classifying sources relevant to recommender systems; 2. the development of a source relevance measure computed from the defined characteristics; 3. the application of a trust measure to obtain the most reliable sources, where trust is defined from the perspective of improving recommendations: a reliable source is one that improves the recommendations; 4. the development of an algorithm for selecting, from a set of candidate sources, the most relevant and reliable ones using the measures mentioned above; 5. the definition of an ontology for structuring the information about user preferences that is available on the Internet; 6. the creation of a mapping process that automatically extracts information about user preferences available on the web and places it in the ontology. These contributions achieve two important objectives: improving recommendations by using alternative information sources that are relevant and reliable, and obtaining the implicit user information available on the Internet.
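The relevance and trust measures themselves are not given in this abstract; the following sketch only illustrates the overall selection idea: score each candidate source by a weighted combination of a characteristics-based relevance value and a trust value, then keep the best-scoring sources. The source names, values, weights and threshold are illustrative assumptions.

```python
# Illustrative sketch of source selection for a recommender (not the thesis's
# measures): rank candidate information sources by a combined
# relevance/trust score and keep those above a threshold.

sources = [
    # (name, characteristics-based relevance in [0,1], trust in [0,1])
    ("purchase_db", 0.9, 0.8),
    ("review_site", 0.7, 0.6),
    ("forum",       0.5, 0.3),
]

def score(relevance: float, trust: float,
          w_rel: float = 0.6, w_trust: float = 0.4) -> float:
    """Weighted mix of relevance and trust; weights are assumptions."""
    return w_rel * relevance + w_trust * trust

ranked = sorted(sources, key=lambda s: score(s[1], s[2]), reverse=True)
selected = [name for name, rel, tr in ranked if score(rel, tr) >= 0.5]
print(selected)  # sources considered both relevant and reliable
```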

Relevance:

100.00%

Publisher:

Abstract:

Many different individuals, each with their own expertise and criteria for decision making, are involved in making decisions on construction projects. Decision-making processes are thus significantly affected by communication, in which the dynamic interplay of human intentions leads to unpredictable outcomes. In order to theorise decision-making processes that include communication, it is argued here that they resemble evolutionary dynamics in terms of both selection and mutation, which can be expressed by the replicator-mutator equation. To support this argument, a mathematical model of decision making has been built by analogy with evolutionary dynamics, with three variables: initial support rate, business hierarchy, and power of persuasion. In addition, a survey of patterns in decision making on construction projects was performed through a self-administered mail questionnaire sent to construction practitioners. Comparison between the numerical analysis of the mathematical model and the statistical analysis of the empirical data shows the significant potential of the replicator-mutator equation as a tool for studying the dynamic properties of intentions in communication.
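The paper's concrete parameterisation is not given in this abstract, but the replicator-mutator equation itself is standard: dx_i/dt = sum_j x_j f_j Q_ji - phi x_i, with average fitness phi = sum_j f_j x_j. Below is a minimal numerical sketch with an assumed mapping of the three variables onto x, f and Q; the concrete numbers are illustrative, not the paper's.

```python
# Minimal sketch of replicator-mutator dynamics:
#   dx_i/dt = sum_j x_j f_j Q_ji - phi * x_i,  phi = sum_j f_j x_j.
# x: support for each option, f: a stand-in for "business hierarchy"
# (fitness), Q: a stand-in for "power of persuasion" (row-stochastic
# mutation matrix). All values are illustrative assumptions.
import numpy as np

def replicator_mutator_step(x, f, Q, dt=0.01):
    phi = float(f @ x)               # average fitness
    dx = (x * f) @ Q - phi * x       # selection + mutation
    return x + dt * dx

x = np.array([0.5, 0.3, 0.2])        # initial support rates
f = np.array([1.0, 1.2, 0.9])        # hierarchy / fitness weights
Q = np.array([[0.90, 0.05, 0.05],    # persuasion (mutation) matrix
              [0.05, 0.90, 0.05],
              [0.05, 0.05, 0.90]])

for _ in range(2000):
    x = replicator_mutator_step(x, f, Q)
print(x, x.sum())  # drifts toward the option favoured by f; sum stays ~1
```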

Relevance:

100.00%

Publisher:

Abstract:

During the last two decades, the public and private sectors have made substantial international research progress toward improving the nutritional value of a wide range of food and feed crops. Nevertheless, significant numbers of people still suffer from the effects of undernutrition. In addition, the nutritional quality of feed is often a limiting factor in livestock production systems, particularly in developing countries. As newly developed crops with nutritionally improved traits come closer to being available to producers and consumers, we must ensure that scientifically sound and efficient processes are used to assess the safety and nutritional quality of these crops. Such processes will facilitate deploying these crops to the areas of the world with large numbers of people who need them. This document describes 5 case studies of crops with improved nutritional value. These case studies examine the principles and recommendations published by the International Life Sciences Institute (ILSI) in 2004 for the safety and nutritional assessment of foods and feeds derived from nutritionally improved crops (ILSI 2004). One overarching conclusion spanning all 5 case studies is that the comparative safety assessment process is a valid approach; it has been endorsed by many publications and organizations, including the 2004 ILSI publication. The type and extent of data appropriate for a scientifically sound comparative safety assessment are presented on a case-by-case basis, in a manner that takes into account scientific results published since the 2004 ILSI report. This report will appear in the January issue of Comprehensive Reviews in Food Science and Food Safety.

Relevance:

100.00%

Publisher:

Abstract:

Two-component systems capable of self-assembling into soft gel-phase materials are of considerable interest due to their tunability and versatility. This paper investigates two-component gels based on a combination of an L-lysine-based dendron and a rigid diamine spacer (1,4-diaminobenzene or 1,4-diaminocyclohexane). The networked gelator was investigated using thermal measurements, circular dichroism, NMR spectroscopy and small-angle neutron scattering (SANS), giving insight into the macroscopic properties, nanostructure and molecular-scale organisation. Surprisingly, all of these techniques confirmed that irrespective of the molar ratio of the components employed, the "solid-like" gel network always consisted of a 1:1 mixture of dendron/diamine. Additionally, the gel network was able to tolerate a significant excess of diamine in the "liquid-like" phase before being disrupted. In the light of this observation, we investigated the ability of the gel network structure to evolve from mixtures of different aromatic diamines present in excess. We found that these two-component gels assembled in a component-selective manner, with the dendron preferentially recognising 1,4-diaminobenzene (>70%) when similar competitor diamines (1,2- and 1,3-diaminobenzene) were present. Furthermore, NMR relaxation measurements demonstrated that the gel based on 1,4-diaminobenzene was better able to form a selective ternary complex with pyrene than the gel based on 1,4-diaminocyclohexane, indicative of controlled and selective pi-pi interactions within a three-component assembly. As such, the results in this paper demonstrate how component-selection processes in two-component gel systems can control hierarchical self-assembly.

Relevance:

100.00%

Publisher:

Abstract:

The electrochemistry of Pt nanostructured electrodes is investigated using hydrodynamic modulated voltammetry (HMV). Here a liquid crystal templating process is used to produce platinum-modified electrodes with a range of surface areas (roughness factor 42.4-280.8). The electroreduction of molecular oxygen at these nanostructured platinum surfaces is used to demonstrate the ability of HMV to discriminate between faradaic and nonfaradaic electrode reactions. The HMV approach shows that the reduction of molecular oxygen experiences considerable signal loss within the region of high pseudocapacitance in the voltammetry. Evidence for the contribution of the double layer to transient mass transfer events is presented. In addition, a model circuit and appropriate theoretical analysis are used to illustrate the transient responses of a time-variant faradaic component. This, in conjunction with the experimental evidence, shows that, far from being a passive component in this system, the double layer can contribute to HMV faradaic reactions under certain conditions.
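The paper's model circuit is not specified in this abstract; as a generic stand-in, the sketch below simulates a Randles-type circuit (solution resistance in series with a double-layer capacitance in parallel with a faradaic resistance) in which the faradaic resistance is modulated in time, showing how the double layer shapes the transient current. All component values are illustrative assumptions.

```python
# Generic sketch (not the paper's actual model circuit): a Randles-type cell
# with solution resistance Rs in series with the parallel pair of
# double-layer capacitance Cdl and a time-variant faradaic resistance Rf(t)
# modulated at the hydrodynamic frequency. Euler integration of
#   Cdl * dv/dt = (E - v)/Rs - v/Rf(t),  cell current i = (E - v)/Rs.
import math

Rs, Cdl, E = 100.0, 20e-6, 0.5       # ohms, farads, volts (assumed)
Rf0, dRf, f_mod = 5e3, 1e3, 1.0      # mean Rf, modulation depth, Hz (assumed)

def simulate(t_end=2.0, dt=1e-5):
    v, t, out = 0.0, 0.0, []         # v: potential across Cdl || Rf
    while t < t_end:
        Rf = Rf0 + dRf * math.sin(2 * math.pi * f_mod * t)
        dv = ((E - v) / Rs - v / Rf) / Cdl
        v += dv * dt
        t += dt
        out.append((t, (E - v) / Rs))  # total cell current through Rs
    return out

i_final = simulate()[-1][1]
print(f"current after transients: {i_final * 1e6:.1f} uA")
```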

Relevance:

100.00%

Publisher:

Abstract:

The transcriptome of an organism is its set of gene transcripts (mRNAs) at a defined spatial and temporal locus. Because gene expression is affected markedly by environmental and developmental perturbations, it is widely assumed that transcriptome divergence among taxa represents adaptive phenotypic selection. This assumption has been challenged by neutral theories which propose that stochastic processes drive transcriptome evolution. To test for evidence of neutral transcriptome evolution in plants, we quantified 18 494 gene transcripts in nonsenescent leaves of 14 taxa of Brassicaceae using robust cross-species transcriptomics which includes a two-step physical and in silico-based normalization procedure based on DNA similarity among taxa. Transcriptome divergence correlates positively with evolutionary distance between taxa and with variation in gene expression among samples. Results are similar for pseudogenes and chloroplast genes evolving at different rates. Remarkably, variation in transcript abundance among root-cell samples correlates positively with transcriptome divergence among root tissues and among taxa. Because neutral processes affect transcriptome evolution in plants, many differences in gene expression among or within taxa may be nonfunctional, reflecting ancestral plasticity and founder effects. Appropriate null models are required when comparing transcriptomes in space and time.
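As an illustration of the kind of test described above (not the paper's data or its two-step normalization pipeline), the sketch below correlates pairwise transcriptome divergence with evolutionary distance; a positive correlation is the signature consistent with clock-like, neutral divergence. The numbers are invented placeholders.

```python
# Illustrative neutral-model check: does expression divergence between taxon
# pairs increase with their evolutionary distance? Data below are made-up
# stand-ins for the paper's Brassicaceae measurements.
import numpy as np
from scipy import stats

evo_distance = np.array([0.5, 1.0, 2.0, 3.5, 5.0, 7.5])       # placeholder
expr_divergence = np.array([0.10, 0.18, 0.27, 0.42, 0.55, 0.71])  # placeholder

rho, p = stats.spearmanr(evo_distance, expr_divergence)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
# a positive rho is consistent with stochastic, clock-like divergence
```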

Relevance:

100.00%

Publisher:

Abstract:

Many construction professionals and policy-makers would agree that client expectations should be accommodated during a building project. However, this aspiration is not easy to fulfil, as there may be conflicting interests within a client organization, and these may change over the course of a project. This research asks why some client interests, and not others, are incorporated into the development of a building project. Actor-Network Theory (ANT) is used to study a single building project on a university campus. The building project is analysed as a series of discussions and negotiations in which actors persuade each other to choose one solution over another. The analysis traces dynamic client engagement in decision-making processes as the available options became increasingly constrained. This relative loss of control was countered by clients who retained control over the timing of participants' involvement, and who were thereby able to impose their interests even at later stages of the project.

Relevance:

100.00%

Publisher:

Abstract:

Microarray data classification is one of the most important emerging clinical applications in the medical community, and machine learning algorithms are most frequently used to complete this task. We selected one of the state-of-the-art kernel-based algorithms, the support vector machine (SVM), to classify microarray data. As a large number of kernels are available, a significant research question is: which kernel is best for patient diagnosis based on microarray data classification using an SVM? We first suggest three solutions based on data visualization and quantitative measures. The proposed solutions are then tested on different types of microarray problems. We find that the rule-based approach is the most useful for automatic kernel selection when using an SVM to classify microarray data.
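The rule-based selection procedure itself is not given in this abstract; the sketch below shows the underlying comparison such a procedure automates, scoring several SVM kernels by cross-validation on placeholder high-dimensional data (scikit-learn is assumed as the toolkit; the paper does not name one).

```python
# Comparing SVM kernels by cross-validation on a stand-in for microarray
# data (few samples, many features). Not the paper's rule-based selector,
# just the brute-force comparison it aims to shortcut.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=2000,
                           n_informative=40, random_state=0)

for kernel in ("linear", "rbf", "poly", "sigmoid"):
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{kernel:>8}: {scores.mean():.3f} +/- {scores.std():.3f}")
```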

Relevance:

100.00%

Publisher:

Abstract:

This paper examines the advantages of using the World Wide Web (Web) as a resource to teach hearing primary-aged children Australian Sign Language (Auslan). There is a trend towards educating signing deaf children in mainstream schools; it is therefore important to teach hearing children sign language to enable meaningful communication and the formation of social relationships between hearing and deaf students. The authors compare various methods of teaching sign language with the Web and describe a selection of the available instructional material. Considerations for designing appropriate sign language teaching material for the Web are discussed, particularly in the context of designing content that engages a primary school aged audience.

Relevance:

100.00%

Publisher:

Abstract:

Personal identification of individuals is becoming increasingly common in society today. Because of the large number of electronic systems that require human identification, faster and more secure identification systems are being pursued. Biometrics is based upon the physical characteristics of individuals; of these, the fingerprint is the most common, as used within law enforcement. Fingerprint-based systems have been introduced into society but have not been well received, due to relatively high rejection rates and false acceptance rates. This limited acceptance of fingerprint identification systems requires new techniques to be investigated to improve the identification method and the acceptance of the technology within society. Electronic fingerprint identification provides a method of identifying an individual within seconds, quickly and easily. The fingerprint must be captured instantly to allow the system to identify the individual without any technical user interaction, to simplify system operation. The performance of the entire system relies heavily on the quality of the original fingerprint image that is captured digitally. A single fingerprint scan for verification makes the system easier for users to access, as it replaces the need to remember passwords or authorisation codes. The identification system comprises several components to perform this function: a fingerprint sensor, a processor, and feature extraction and verification algorithms. A compact texture feature extraction method is implemented within an embedded microprocessor-based system, offering security, performance and cost-effective production compared with currently available commercial fingerprint identification systems. Various software packages are available for developing programs for Windows-based operating systems, but development must not be constrained to a graphical user interface alone. MATLAB was the software package chosen for this thesis, owing to its strong mathematical, data analysis and image analysis libraries and capabilities. MATLAB enables the complete fingerprint identification system to be developed and implemented within a PC environment and also to be exported at a later date directly to an embedded processing environment. The nucleus of the fingerprint identification system is the feature extraction approach presented in this thesis, which uses global texture information, unlike the local information used in traditional minutiae-based identification methods. Commercial solid-state sensors, such as the type selected for use in this thesis, have a limited contact area with the fingertip and therefore sample only a limited portion of the fingerprint. This limits the number of minutiae that can be extracted from the fingerprint and, as such, limits the number of common singular points between two impressions of the same fingerprint. The application of texture feature extraction is tested using a variety of fingerprint images to determine the most appropriate format for use within the embedded system. This thesis focuses on designing a fingerprint-based identification system that is highly expandable using the MATLAB environment. The main components defined within this thesis are the hardware design, image capture, image processing and feature extraction methods. The final system components for this electronic fingerprint identification system were selected using specific criteria to yield the highest performance from an embedded processing environment. These platforms are very cost effective and will allow fingerprint-based identification technology to be implemented in more commercial products that can benefit from the security and simplicity of a fingerprint identification system.
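The thesis's exact texture descriptor is not described in this abstract (and the thesis itself used MATLAB); as a minimal stand-in, the NumPy sketch below computes one common kind of *global* texture feature, a radial spectral-energy signature of the whole fingerprint image, in contrast to local minutiae points. The band count and the random test image are illustrative assumptions.

```python
# Minimal sketch of a global texture descriptor (not the thesis's method):
# summarise a fingerprint image by its FFT power in a few radial frequency
# bands, normalised to unit sum. Matching two impressions then reduces to
# comparing two short feature vectors.
import numpy as np

def global_texture_features(img: np.ndarray, n_bands: int = 8) -> np.ndarray:
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)        # distance from DC component
    r_max = r.max()
    feats = np.array([power[(r >= r_max * i / n_bands) &
                            (r < r_max * (i + 1) / n_bands)].sum()
                      for i in range(n_bands)])
    return feats / feats.sum()

# placeholder "fingerprint": a random image stands in for a sensor capture
rng = np.random.default_rng(0)
print(global_texture_features(rng.random((128, 128))))
```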

Relevance:

100.00%

Publisher:

Abstract:

ASA theory is one of the most important explanations of behaviour in organisations. Developed by Professor Ben Schneider, it is the idea that organisations contain similar types of people because they attract, select and retain people similar to those already employed by the organisation. This homogeneity explains why organisations are different to each other. Although a lot is known about attrition, little is known about the attraction and selection phases. This book contains a series of empirical studies that explore whether organisations attract and select people who hold the values of the people already employed by the organisation. The results of these studies cast doubt on how universal ASA theory might be and suggest that the initial employment decisions that people make are more about choosing their vocation than their employer.