983 results for self organising feature maps (SOFM or SOM)


Relevance: 100.00%

Abstract:

In this paper, the Binary Search Tree Imposed Growing Self Organizing Map (BSTGSOM) is presented as an extended version of the Growing Self Organizing Map (GSOM), which has proven advantages in knowledge discovery applications. A binary search tree imposed on the GSOM is used mainly to investigate the dynamic behaviour of the GSOM with respect to its inputs; the generated temporal patterns are stored so that the behaviour of the GSOM can be further analysed as a function of the input sequence. The performance advantages are also discussed and compared with those of the original GSOM.
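The core idea of recording the temporal sequence of winning map nodes in a binary search tree can be sketched as follows. This is an illustrative toy, not the authors' implementation: the map size, inputs and function names are all assumptions. Each best-matching unit (BMU) index of a toy SOM is inserted into a BST, with repeat wins counted, so winner patterns can later be queried.

```python
# Illustrative sketch (assumed names/data): store the SOM's winner sequence
# in a binary search tree keyed by BMU index.
import numpy as np

class Node:
    def __init__(self, key):
        self.key, self.count = key, 1      # BMU index and how often it won
        self.left = self.right = None

def insert(root, key):
    """Insert a BMU index into the tree, counting repeat wins."""
    if root is None:
        return Node(key)
    if key == root.key:
        root.count += 1
    elif key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def bmu(weights, x):
    """Index of the map node whose weight vector is closest to input x."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))

rng = np.random.default_rng(0)
weights = rng.random((16, 3))              # a toy 16-node map with 3-D inputs
root = None
for x in rng.random((100, 3)):             # feed an input sequence
    root = insert(root, bmu(weights, x))
```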

Relevance: 100.00%

Abstract:

In this work, the SOM (Self Organizing Maps) algorithm, or Kohonen neural network, is implemented in the form of hierarchical structures and applied to image compression. The main objective of this approach is to develop a hierarchical SOM algorithm with a static structure, and another with a dynamic structure, to generate codebooks for image Vector Quantization (VQ), reducing processing time and obtaining a good image compression rate with minimal degradation of quality relative to the original image. The two self-organizing neural networks developed here were denominated HSOM, for the static case, and DHSOM, for the dynamic case. In the first, the hierarchical structure is defined in advance; in the latter, the structure grows automatically according to heuristic rules that explore the training data without the use of external parameters. These heuristic rules determine the dynamics of growth, the criteria for pruning branches, and the flexibility and size of the child maps. The LBG (Linde-Buzo-Gray) algorithm, or K-means, one of the algorithms most used to build codebooks for Vector Quantization, was used together with the Kohonen algorithm in its basic (non-hierarchical) form as a reference against which to compare the performance of the algorithms proposed here. A performance analysis of the two hierarchical structures is also carried out. The efficiency of the proposed approach is verified by the reduction in computational complexity compared with the traditional algorithms, as well as through quantitative analysis of the reconstructed images in terms of the peak signal-to-noise ratio (PSNR) and the mean squared error (MSE).
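As a rough illustration of the VQ pipeline the abstract describes, the sketch below builds a codebook with k-means (the LBG-style baseline) on toy 4-pixel blocks and scores the reconstruction with MSE-based PSNR. All names, block sizes and parameters are assumptions for illustration, not the HSOM/DHSOM implementation.

```python
# Hedged sketch (invented data): k-means codebook for vector quantization,
# plus the PSNR/MSE quality measures mentioned in the abstract.
import numpy as np

def kmeans_codebook(blocks, k, iters=20, seed=0):
    """Build a k-entry codebook for the given sample blocks."""
    rng = np.random.default_rng(seed)
    book = blocks[rng.choice(len(blocks), k, replace=False)].astype(float)
    for _ in range(iters):
        idx = np.argmin(((blocks[:, None, :] - book[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(idx == j):
                book[j] = blocks[idx == j].mean(axis=0)   # move codeword to mean
    idx = np.argmin(((blocks[:, None, :] - book[None]) ** 2).sum(-1), axis=1)
    return book, idx

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio from the mean squared error."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

# Quantize 4-pixel blocks of a toy "image" with an 8-entry codebook.
rng = np.random.default_rng(1)
blocks = rng.integers(0, 256, size=(500, 4))
book, idx = kmeans_codebook(blocks, k=8)
recon = book[idx]                          # each block replaced by its codeword
print(round(psnr(blocks, recon), 2))
```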

Relevance: 100.00%

Abstract:

Nowadays, the latest generation of computers provides the high performance needed to build computationally expensive computer vision applications for mobile robotics. Building a map of the environment is a common task for a robot and an essential prerequisite for moving through that environment. Traditionally, mobile robots have used a combination of sensors based on different technologies: lasers, sonars and contact sensors have typically featured in mobile robotic architectures. Color cameras, however, are an important sensor because we want robots to use the same information that humans do to sense and move through different environments. Color cameras are cheap and flexible, but a lot of work needs to be done to give robots sufficient visual understanding of scenes. Computer vision algorithms are computationally complex, but robots now have access to different and powerful architectures that can be used for mobile robotics purposes. The advent of low-cost RGB-D sensors such as the Microsoft Kinect, which provide 3D colored point clouds at high frame rates, has made computer vision even more relevant to the mobile robotics field. The combination of visual and 3D data allows systems to apply both computer vision and 3D processing, and therefore to be aware of more details of the surrounding environment. The research described in this thesis was motivated by the need for scene mapping. Awareness of the surrounding environment is a key feature in many mobile robotics applications, from simple robotic navigation to complex surveillance. In addition, acquiring a 3D model of a scene is useful in many areas, such as video game scene modeling, where well-known places are reconstructed and added to game systems, or advertising, where, once the 3D model of a room is obtained, the system can add furniture using augmented reality techniques.
In this thesis we perform an experimental study of state-of-the-art registration methods to find which one best fits our scene mapping purposes. Different methods are tested and analyzed on scenes with different distributions of visual and geometric appearance. In addition, this thesis proposes two methods for 3D data compression and representation of 3D maps. Our 3D representation proposal is based on the Growing Neural Gas (GNG) method. This self-organizing model has been successfully used for clustering, pattern recognition and topology representation of various kinds of data. Until now, self-organizing maps have been computed primarily offline, and their application to 3D data has mainly focused on noise-free models without considering time constraints. Self-organising neural models have the ability to provide a good representation of the input space. In particular, the Growing Neural Gas is a suitable model because of its flexibility, rapid adaptation and excellent quality of representation. However, this type of learning is time consuming, especially for high-dimensional input data. Since real applications often work under time constraints, it is necessary to adapt the learning process so that it completes within a predefined time. This thesis proposes a hardware implementation that leverages the computing power of modern GPUs, taking advantage of the paradigm known as General-Purpose Computing on Graphics Processing Units (GPGPU). Our proposed geometrical 3D compression method seeks to reduce the 3D information using plane detection as the basic structure for compressing the data; because our target environments are man-made, many points belong to planar surfaces. Our method achieves good compression results in such man-made scenarios, and the detected and compressed planes can also be used in other applications such as surface reconstruction or plane-based registration algorithms.
Finally, we have also demonstrated the value of GPU technologies by obtaining a high-performance implementation of a common CAD/CAM technique called Virtual Digitizing.
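The plane-based compression idea can be illustrated with a minimal least-squares plane fit: store a plane's parameters instead of every point lying on it. This is a generic sketch under assumed data, not the thesis implementation.

```python
# Illustrative sketch (invented point cloud): fit a plane with SVD; the
# smallest right singular vector of the centered points is the plane normal.
import numpy as np

def fit_plane(points):
    """Least-squares plane through points; returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]                # last row = direction of least variance

rng = np.random.default_rng(0)
# Toy cloud: points on the plane z = 0 with a little sensor-like noise.
pts = np.column_stack([rng.random(200), rng.random(200),
                       0.01 * rng.standard_normal(200)])
c, n = fit_plane(pts)                      # n should be close to (0, 0, ±1)
```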

Relevance: 100.00%

Abstract:

Background: Incidence of and mortality from skin cancers, including melanoma, are highest among men 50 years or older. Thorough skin self-examination may help to improve skin cancer outcomes.

Objectives: To develop and conduct a randomized controlled trial of a video-based intervention to improve skin self-examination behavior among men 50 years or older.

Methods: Pilot work ascertained appropriate targeting of the 12-minute intervention video towards men 50 years or older. Overall, 968 men were recruited and 929 completed the baseline telephone assessment. Baseline analysis assessed randomization balance and the demographic, skin cancer risk and attitudinal factors associated with conducting a whole-body skin self-examination, or receiving a whole-body clinical skin examination by a doctor, during the past 12 months.

Results: Randomization resulted in well-balanced intervention and control groups. Overall, 13% of men reported conducting a thorough skin self-examination, using a mirror or the help of another person to check difficult-to-see areas, while 39% reported having received a whole-body skin examination by a doctor within the past 12 months. Confidence in finding time for a skin self-examination, and receiving advice or instruction from a doctor to perform one, were among the factors associated with thorough skin self-examination at baseline.

Conclusions: Men 50 years or older can be successfully recruited to a video-based intervention trial with the aim of reducing their burden of skin cancer. Randomization by a computer-generated randomization list resulted in good balance between the control and intervention groups, and baseline analysis identified factors associated with skin cancer early-detection behavior.

Relevance: 100.00%

Abstract:

As the world's population grows, so does the demand for agricultural products. However, natural nitrogen (N) fixation and phosphorus (P) availability cannot sustain the rising agricultural production, so the application of N and P fertilisers as additional nutrient sources is common. These anthropogenic activities can contribute high amounts of organic and inorganic nutrients to both surface waters and groundwaters, resulting in degradation of water quality and a possible reduction of aquatic life. In addition, runoff and sewage from urban and residential areas can contain high amounts of inorganic and organic nutrients which may also affect water quality. For example, blooms of the cyanobacterium Lyngbya majuscula along the coastline of southeast Queensland are an indicator of at least short-term decreases in water quality. Although Australian catchments, including those with intensive forms of land use, generally show low nutrient export compared with North American and European catchments, certain land use practices may still have a detrimental effect on the coastal environment. Numerous studies of nutrient cycling and associated processes at the catchment scale have been reported in the Northern Hemisphere. Comparable studies in Australia, in particular in subtropical regions, are limited, and there is a paucity of data, in particular for the inorganic and organic forms of nitrogen and phosphorus; these nutrients are important limiting factors for algal blooms in surface waters. Therefore, monitoring N and P and understanding the sources and pathways of these nutrients within a catchment is important in coastal zone management. Although Australia is the driest continent, in subtropical regions such as southeast Queensland rainfall patterns have a significant effect on runoff and thus on the nutrient cycle at the catchment scale, and these rainfall patterns are becoming increasingly variable.
Monitoring these climatic conditions and the hydrological response of agricultural catchments is therefore also important for reducing anthropogenic effects on surface water and groundwater quality. This study takes an integrated hydrological–hydrochemical approach to assessing N and P in an environment with multiple land uses. The main aim is to determine the nutrient cycle within a representative coastal catchment in southeast Queensland, the Elimbah Creek catchment. In particular, the investigation confirms the influence of forestry and agriculture on the forms, sources, distribution and fate of N and P in the surface waters and groundwaters of this subtropical setting. In addition, the study determines whether N and P are transported into the adjacent estuary and thus into the marine environment; the effects of local topography, soils and geology on N and P sources and distribution are also considered. The thesis is structured around four individually reported components. The first paper determines the controls of catchment settings and processes on N and P concentrations in stream water, riverbank sediment and shallow groundwater, in particular during the extended dry conditions encountered during the study. Temporal and spatial factors such as seasonal changes, soil character, land use and catchment morphology are considered, as well as their effect on the distribution of N and P in surface waters and associated groundwater. A total of 30 surface water and 13 shallow groundwater sampling sites were established throughout the catchment to represent the dominant soil types and the land use upstream of each sampling location. Sampling comprised five rounds conducted over one year, between October 2008 and November 2009. Surface water and groundwater samples were analysed for all major dissolved inorganic forms of N and for total N.
Phosphorus was determined as dissolved reactive P (predominantly orthophosphate) and total P. In addition, extracts of stream bank sediments and soil grab samples were analysed for these N and P species. Findings show that major storm events, in particular after long periods of drought, are the driving force of N cycling. This is expressed by higher inorganic N concentrations in the agricultural subcatchment than in the forested subcatchment. Nitrate N is the dominant inorganic form of N in both the surface waters and the groundwaters, and values are significantly higher in the groundwaters. Concentrations in the surface water range from 0.03 to 0.34 mg N L⁻¹; organic N concentrations are considerably higher (average range: 0.33 to 0.85 mg N L⁻¹), in particular in the forested subcatchment. Average NO3-N in the groundwater ranges from 0.39 to 2.08 mg N L⁻¹, and organic N averages between 0.07 and 0.3 mg N L⁻¹. The stream bank sediments are dominated by organic N (range: 0.53 to 0.65 mg N L⁻¹), and the dominant inorganic form of N is NH4-N, with values ranging between 0.38 and 0.41 mg N L⁻¹. Topography and soils, however, were not found to have a significant effect on N and P concentrations in the waters. Detectable phosphorus in the surface waters and groundwaters of the catchment is limited to several locations, typically in the proximity of areas with intensive animal use; in soils and sediments, P is negligible. In the second paper, the stable isotopes of N (14N/15N) and H2O (16O/18O and 2H/H) in surface waters and groundwaters are used to identify the sources of dissolved inorganic and organic N in these waters and to determine their pathways within the catchment; specific emphasis is placed on the respective roles of forestry and agriculture. Forestry is predominantly concentrated in the northern subcatchment (Beerburrum Creek), while agriculture is mainly found in the southern subcatchment (Six Mile Creek).
Results show that agriculture (horticulture, crops, grazing) is the main source of inorganic N in the surface waters of the agricultural subcatchment, and the isotopic signature shows a close link to evaporation processes that may occur during water storage in the farm dams used for irrigation. Groundwaters are subject to denitrification processes that may reduce dissolved inorganic N concentrations. Soil organic matter delivers most of the inorganic N to the surface water in the forested subcatchment, where precipitation, and subsequently runoff, is the main source of the surface waters. Groundwater in this area is affected by agricultural processes. The findings also show that the catchment can attenuate the effects of anthropogenic land use on surface water quality. Riparian strips of natural remnant vegetation, commonly 50 to 100 m in width, act as buffer zones along the drainage lines in the catchment and remove inorganic N from the soil water before it enters the creek. These riparian buffer zones are common in most agricultural catchments of southeast Queensland and appear to reduce the impact of agriculture on stream water quality and subsequently on the estuarine and marine environments. This reduction is expressed by a significant decrease in DIN concentrations from 1.6 mg N L⁻¹ to 0.09 mg N L⁻¹, and a decrease in the δ15N signatures, from upstream surface water locations downstream to the outlet of the agricultural subcatchment. Further testing is, however, necessary to confirm these processes. Most importantly, the amount of N transported to the adjacent estuary is shown to be negligible. The third and fourth components of the thesis use a hydrological catchment modelling approach to determine the water balance of the Elimbah Creek catchment. The model is then used to simulate the effects of land use on the water balance and nutrient loads of the study area.
The tool used is the internationally widely applied Soil and Water Assessment Tool (SWAT). Knowledge of a catchment's water cycle is imperative in nutrient studies, as processes such as rainfall, surface runoff, soil infiltration and the routing of water through the drainage system are the driving forces of the catchment nutrient cycle. Long-term information about discharge volumes of the creeks and rivers does not, however, exist for a number of agricultural catchments in southeast Queensland, and such information is necessary to calibrate and validate numerical models. Therefore, a two-step modelling approach was used, in which parameter values calibrated and validated for a nearby gauged reference catchment served as starting values for the ungauged Elimbah Creek catchment. Transposing monthly calibrated and validated parameter values from the reference catchment to the ungauged catchment significantly improved model performance, showing that the hydrological model of the catchment of interest is a strong predictor of the water balance. The model efficiency coefficient EF shows that 94% of the simulated discharge matches the observed flow, whereas only 54% of the observed streamflow was simulated by the SWAT model before the validated values from the reference catchment were used. In addition, the hydrological model confirmed that total surface runoff contributes the majority of the flow to the surface water in the catchment (65%); only a small proportion of the water in the creek is contributed by total baseflow (35%). This finding supports the results of the stable isotopes 16O/18O and 2H/H, which show that the main source of water in the creeks is either local precipitation or irrigation water delivered by surface runoff; a contribution from the groundwater (baseflow) to the creeks could not be identified using 16O/18O and 2H/H.
In addition, the SWAT model calculated that around 68% of the rainfall in the catchment is lost through evapotranspiration, reflecting the prevailing long-term drought conditions observed prior to and during the study. Stream discharge from the forested subcatchment was an order of magnitude lower than discharge from the agricultural Six Mile Creek subcatchment. A simulated change in land use from forestry to agriculture did not significantly change the catchment water balance; nutrient loads, however, increased considerably. Conversely, a simulated change from agriculture to forestry resulted in a significant decrease in nitrogen loads. The findings of the thesis and the approach used are shown to be of value to catchment water quality monitoring on a wider scale, in particular regarding the implications of mixed land use for nutrient forms, distributions and concentrations. The study confirms that in the tropics and subtropics the water balance is affected by extended dry periods and by seasonal rainfall with intensive storm events. In particular, the comprehensive data set of inorganic and organic N and P forms in the surface waters and groundwaters of this subtropical setting, acquired during the one-year sampling program, may be used in similar catchment hydrological studies where such detailed information is missing. The study also concludes that riparian buffer zones along the catchment drainage system attenuate the transport of nitrogen from agricultural sources in the surface water: concentrations of N decreased from upstream to downstream locations and were negligible at the outlet of the catchment.
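The model efficiency coefficient EF used to judge the SWAT simulations is commonly computed as the Nash-Sutcliffe efficiency, which compares simulated to observed streamflow. A minimal sketch follows; the discharge values are made up for illustration.

```python
# Hedged sketch: Nash-Sutcliffe model efficiency. A value of 1.0 means the
# simulated series matches observations perfectly; 0 means it is no better
# than predicting the observed mean.
import numpy as np

def nash_sutcliffe(observed, simulated):
    """EF = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = [3.1, 4.0, 2.5, 6.2, 5.0]            # hypothetical monthly discharges
sim = [3.0, 4.2, 2.7, 6.0, 4.8]
ef = nash_sutcliffe(obs, sim)
```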

Relevance: 100.00%

Abstract:

For active contour modeling (ACM), we propose a novel self-organizing map (SOM)-based approach, called the batch-SOM (BSOM), that attempts to integrate the advantages of SOM- and snake-based ACMs in order to extract the desired contours from images. We employ feature points, in the form of an edge map (as obtained from a standard edge-detection operation), to guide the contour (as in SOM-based ACMs), along with the gradient and intensity variations in a local region to ensure that the contour does not "leak" into the object boundary in the case of faulty feature points (weak or broken edges). In contrast with snake-based ACMs, however, we do not use an explicit energy functional (based on gradient or intensity) to control the contour movement. We extend the BSOM to handle the extraction of contours of multiple objects by splitting a single contour into as many subcontours as there are objects in the image. The BSOM and its extended version are tested on synthetic binary and gray-level images with both single and multiple objects. We also demonstrate the efficacy of the BSOM on images of objects having both convex and nonconvex boundaries. The results demonstrate the superiority of the BSOM over existing approaches. Finally, we analyze the limitations of the BSOM.
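The batch update underlying a SOM-based contour model can be sketched as follows. This is a generic illustration with an invented contour and edge points, not the BSOM itself, which additionally uses local gradient and intensity information to guard against faulty feature points.

```python
# Illustrative sketch: one batch-SOM step moves each contour node to the
# mean of the edge points for which it is the best-matching unit.
import numpy as np

def batch_som_step(contour, edge_points):
    """One batch update of contour nodes toward their assigned edge points."""
    d = np.linalg.norm(edge_points[:, None, :] - contour[None], axis=2)
    winners = d.argmin(axis=1)                  # nearest node per edge point
    new = contour.copy()
    for j in range(len(contour)):
        if np.any(winners == j):
            new[j] = edge_points[winners == j].mean(axis=0)
    return new

# Toy setup: a contour starting at radius 2 shrinks onto unit-circle edges.
theta = np.linspace(0, 2 * np.pi, 12, endpoint=False)
contour = 2.0 * np.column_stack([np.cos(theta), np.sin(theta)])
edges = np.column_stack([np.cos(theta), np.sin(theta)])
for _ in range(5):
    contour = batch_som_step(contour, edges)
```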

Relevance: 100.00%

Abstract:

Processes of enrichment, concentration and retention are thought to be important for the successful recruitment of small pelagic fish in upwelling areas, but are difficult to measure. In this study, a novel approach is used to examine the role of spatio-temporal oceanographic variability on recruitment success of the Northern Benguela sardine Sardinops sagax. This approach applies a neural network pattern recognition technique, called a self-organising map (SOM), to a seven-year time series of satellite-derived sea level data. The Northern Benguela is characterised by quasi-perennial upwelling of cold, nutrient-rich water and is influenced by intrusions of warm, nutrient-poor Angola Current water from the north. In this paper, these processes are categorised in terms of their influence on recruitment success through the key ocean triad mechanisms of enrichment, concentration and retention. Moderate upwelling is seen as favourable for recruitment, whereas strong upwelling, weak upwelling and Angola Current intrusion appear detrimental to recruitment success. The SOM was used to identify characteristic patterns from sea level difference data and these were interpreted with the aid of sea surface temperature data. We found that the major oceanographic processes of upwelling and Angola Current intrusion dominated these patterns, allowing them to be partitioned into those representing recruitment favourable conditions and those representing adverse conditions for recruitment. A marginally significant relationship was found between the index of sardine recruitment and the frequency of recruitment favourable conditions (r² = 0.61, p = 0.068, n = 6). Because larvae are vulnerable to environmental influences for a period of at least 50 days after spawning, the SOM was then used to identify windows of persistent favourable conditions lasting longer than 50 days, termed recruitment favourable periods (RFPs).
The occurrence of RFPs was compared with back-calculated spawning dates for each cohort. Finally, a comparison of RFPs with the time of spawning and the index of recruitment showed that in years where there were 50 or more days of favourable conditions following spawning, good recruitment followed (Mann-Whitney U-test: p = 0.064, n = 6). These results show the value of the SOM technique for describing spatio-temporal variability in oceanographic processes. Variability in these processes appears to be an important factor influencing recruitment in the Northern Benguela sardine, although the available data time series is currently too short to be conclusive. Nonetheless, the analysis of satellite data, using a neural network pattern-recognition approach, provides a useful framework for investigating fisheries recruitment problems.
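The SOM pattern-recognition step can be illustrated with a minimal one-dimensional map trained on toy data. The study used satellite sea level fields, so everything below (map size, data, decay schedules) is an assumption for illustration only.

```python
# Hedged sketch of basic SOM training: winner selection plus a Gaussian
# neighbourhood pull, with decaying learning rate and neighbourhood width.
import numpy as np

def train_som(data, n_nodes=6, epochs=30, lr=0.5, sigma=1.5, seed=0):
    """Train a 1-D SOM; returns the node weight vectors."""
    rng = np.random.default_rng(seed)
    w = rng.random((n_nodes, data.shape[1]))
    pos = np.arange(n_nodes)
    for t in range(epochs):
        a = lr * (1 - t / epochs)                  # decaying learning rate
        s = max(sigma * (1 - t / epochs), 0.1)     # shrinking neighbourhood
        for x in data:
            b = np.argmin(np.linalg.norm(w - x, axis=1))      # winner node
            h = np.exp(-((pos - b) ** 2) / (2 * s ** 2))      # neighbourhood
            w += a * h[:, None] * (x - w)          # pull nodes toward x
    return w

rng = np.random.default_rng(1)
# Two synthetic "ocean states": patterns near 0.1 and patterns near 0.9.
data = np.vstack([rng.normal(0.1, 0.02, (20, 4)),
                  rng.normal(0.9, 0.02, (20, 4))])
w = train_som(data)
```

After training, each input pattern can be assigned to its best-matching unit, partitioning the time series into a small set of characteristic states, which is essentially how the study categorised favourable and adverse conditions.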

Relevance: 100.00%

Abstract:

There has been significant interest in the methodologies of controlled release for a diverse range of applications spanning drug delivery, biological and chemical sensors, and diagnostics. The advancement in novel substrate-polymer coupling moieties has led to the discovery of self-immolative linkers. This new class of linker has gained popularity in recent years in polymeric release technology as a result of stable bond formation between protecting and leaving groups, which becomes labile upon activation, leading to the rapid disassembly of the parent polymer. This ability has prompted numerous studies into the design and development of self-immolative linkers and the kinetics surrounding their disassembly. This review details the main concepts that underpin self-immolative linker technologies that feature in polymeric or dendritic conjugate systems and outlines the chemistries of amplified self-immolative elimination.

Relevance: 100.00%

Abstract:

The motivation for this thesis work is the need to improve the reliability of equipment and the quality of service to railway passengers, as well as the requirement for cost-effective and efficient condition-maintenance management in rail transportation. This thesis develops a fusion of several machine vision analysis methods to achieve high performance in the automated inspection of wooden rail tracks. Condition monitoring in rail transport is traditionally done manually by a human operator, who relies on inference and assumptions to reach conclusions. Condition monitoring allows maintenance to be scheduled, or other actions to be taken, to avoid the consequences of failure before it occurs. Manual or automated condition monitoring of materials in public transportation fields such as railways, aerial navigation and traffic safety, where safety is of prime importance, requires non-destructive testing (NDT). In general, wooden railway sleepers are inspected manually by a human operator who moves along the track, gathering information by visual and sound analysis to examine the presence of cracks. In this project, a machine vision system is developed based on this manual visual analysis, using digital cameras and image processing software to perform similar inspections. Manual inspection requires much effort, is sometimes error-prone, and can be difficult even for a human operator because of frequent changes in the inspected material. The machine vision system developed here classifies the condition of the material by examining individual pixels of images, processing them, and drawing conclusions with the assistance of knowledge bases and extracted features. A pattern recognition approach is developed based on methodological knowledge of the manual procedure.
The pattern recognition approach was realised as a non-destructive testing method to identify flaws that manual condition monitoring of sleepers may miss. A test vehicle was designed to capture sleeper images in a manner similar to visual inspection by a human operator, and the captured images of the wooden sleepers provide the raw data for the pattern recognition approach. The data from the NDT method were further processed and appropriate features extracted, with the aim of achieving high accuracy and reliable classification. A key idea is to use an unsupervised classifier, based on the extracted features, to discriminate the condition of wooden sleepers into either good or bad; a self-organising map is used as the classifier for the wooden sleeper classification. To achieve greater integration, the data collected by the machine vision system were combined using a strategy called fusion. Data fusion was examined at two levels: sensor-level fusion and feature-level fusion. As the goal was to reduce human error in classifying rail sleepers as good or bad, the results obtained by feature-level fusion, compared with the actual classification, were satisfactory.
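Feature-level fusion as described, combining features from different sources into one vector per sample before unsupervised classification, can be sketched as follows. The data, dimensions and the tiny two-node competitive network (a SOM stripped of its neighbourhood function) are invented for illustration and are not the thesis implementation.

```python
# Hedged sketch: concatenate per-sample features from two sources, then
# split the fused vectors into two clusters ("good"/"bad") with a two-node
# competitive network.
import numpy as np

def fuse(features_a, features_b):
    """Feature-level fusion: concatenate per-sample feature vectors."""
    return np.hstack([features_a, features_b])

def two_node_som(data, epochs=30, lr=0.2):
    """Two competing prototypes, seeded from the first and last sample."""
    w = np.stack([data[0], data[-1]]).astype(float)
    for t in range(epochs):
        for x in data:
            b = np.argmin(np.linalg.norm(w - x, axis=1))   # winning prototype
            w[b] += lr * (1 - t / epochs) * (x - w[b])     # move winner only
    return w

rng = np.random.default_rng(2)
# Toy features: 15 "good" then 15 "bad" sleepers from two feature sources.
cam = np.vstack([rng.normal(0, 0.1, (15, 3)), rng.normal(1, 0.1, (15, 3))])
tex = np.vstack([rng.normal(0, 0.1, (15, 2)), rng.normal(1, 0.1, (15, 2))])
fused = fuse(cam, tex)                    # 30 samples, 5 fused features each
w = two_node_som(fused)
labels = np.argmin(np.linalg.norm(fused[:, None] - w[None], axis=2), axis=1)
```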

Relevance: 100.00%

Abstract:

Background: Patient education and self-management programs are offered in many countries to people with chronic conditions such as osteoarthritis (OA). The most well-known is the disease-specific Stanford Arthritis Self-Management Program (ASMP). While Australian and international clinical guidelines promote the concept of self-management for OA, there is currently little evidence to support the use of the ASMP. Several meta-analyses have reported that arthritis self-management programs had minimal or no effect on reducing pain and disability. However, previous studies have had methodological shortcomings including the use of outcome measures which do not accurately reflect program goals. Additionally, limited cost-effectiveness analyses have been undertaken and the cost-utility of the program has not been explored.

Methods/design: This study is a randomised controlled trial to determine the efficacy (in terms of Health-Related Quality of Life and self-management skills) and cost-utility of a 6-week group-based Stanford ASMP for people with hip or knee OA.

Six hundred participants referred to an orthopaedic surgeon or rheumatologist for hip or knee OA will be recruited from outpatient clinics at 2 public hospitals and community-based private practices within 2 private hospital settings in Victoria, Australia. Participants must be 18 years or over, fluent in English and able to attend ASMP sessions. Exclusion criteria include cognitive dysfunction, previous participation in self-management programs and placement on a waiting list for joint replacement surgery or scheduled joint replacement.

Eligible, consenting participants will be randomised to an intervention group (who receive the ASMP and an arthritis self-management book) or a control group (who receive the book only). Follow-up will be at 6 weeks, 3 months and 12 months using standardised self-report measures. The primary outcome is Health-Related Quality of Life at 12 months, measured using the Assessment of Quality of Life instrument. Secondary outcome measures include the Health Education Impact Questionnaire, Western Ontario and McMaster Universities Osteoarthritis Index (pain subscale and total scores), Kessler Psychological Distress Scale and the Hip and Knee Multi-Attribute Priority Tool. Cost-utility analyses will be undertaken using administrative records and self-report data. A subgroup of 100 participants will undergo qualitative interviews to explore the broader potential impacts of the ASMP.

Discussion:
Using an innovative design combining both quantitative and qualitative components, this project will provide high quality data to facilitate evidence-based recommendations regarding the ASMP.

Relevance: 100.00%

Abstract:

Background Self-management is seen as a primary mechanism to support the optimization of care for people with chronic diseases such as symptomatic vascular disease. There are no established and evidence-based stroke-specific chronic disease self-management programs. Our aim is to evaluate whether a stroke-specific program is safe and feasible as part of a Phase II randomized-controlled clinical trial.
Methods Stroke survivors are recruited from a variety of sources including: hospital stroke services, local paper advertisements, the Stroke South Australia newsletter (a volunteer peer support organization), Divisions of General Practice, and community service providers across Adelaide, South Australia. Subjects are invited to participate in a multi-center, single-blind, randomized, controlled trial. Eligible participants are randomized to either:
• standard care,
• standard care plus a six-week generic chronic condition self-management group education program, or
• standard care plus an eight-week stroke-specific self-management education group program.
Interventions are conducted after discharge from hospital. Participants are assessed at baseline, immediate post intervention and six months.
Study Outcomes The primary outcome measures determine study feasibility and safety, measuring, recruitment, participation, compliance and adverse events.
Secondary outcomes include:
• positive and active engagement in life measured by the Health Education Impact Questionnaire,
• improvements in quality of life measured by the Assessment of Quality of Life instrument,
• improvements in mood measured by the Irritability, Depression and Anxiety Scale,
• health resource utilization, measured by a participant-held diary, and safety.

Conclusion The results of this study will determine whether a definitive Phase III efficacy trial is justified.

Resumo:

Musical preference has long been a research interest in the field of music education, and studies consistently confirm the importance of musical preference in one's musical learning experiences. However, only a limited number of studies have focused on the field of early childhood education (e.g., Hargreaves, North, & Tarrant, 2006; Roulston, 2006). Further, among these limited early childhood studies, few discuss children's musical preference in both the East and the West. There is very limited literature (e.g., Faulkner et al., 2010; Szymanska, 2012) which explores such data using a data mining approach. This study aims to bridge these research gaps by examining children's musical preference in Hong Kong and in South Australia by applying a data mining technique, Self Organising Maps (SOM), a clustering method that groups similar data objects together. The application of SOM is new in the field of early childhood education and also in the study of children's musical preference. This paper specifically aims to expand a previous study (Yim & Ebbeck, 2009) by conducting deeper investigations into the existing datasets, for the purpose of uncovering insights that have not been identified through a data mining approach.
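The clustering idea behind SOM (grouping similar data objects onto nearby map nodes) can be sketched as follows. This is a minimal, self-contained illustration on made-up 2-D data, not the authors' implementation; grid size, decay schedules and the toy clusters are all assumptions.

```python
import math
import random

def train_som(data, grid_w=2, grid_h=2, epochs=50, lr0=0.5, sigma0=1.0, seed=0):
    """Train a tiny rectangular SOM; returns a dict mapping grid node -> weight vector."""
    rng = random.Random(seed)
    weights = {(i, j): [rng.random(), rng.random()]
               for i in range(grid_w) for j in range(grid_h)}
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # learning rate decays over time
        sigma = max(sigma0 * (1 - t / epochs), 0.3)   # neighbourhood radius shrinks
        for x in data:
            # Best-matching unit: the node whose weights are closest to the sample.
            bmu = min(weights,
                      key=lambda n: sum((w - v) ** 2 for w, v in zip(weights[n], x)))
            for node, w in weights.items():
                d2 = (node[0] - bmu[0]) ** 2 + (node[1] - bmu[1]) ** 2
                h = math.exp(-d2 / (2 * sigma ** 2))  # Gaussian neighbourhood function
                for k in range(len(w)):
                    w[k] += lr * h * (x[k] - w[k])    # pull node toward the sample
    return weights

def assign(weights, x):
    """Map a sample to its best-matching grid node (its cluster)."""
    return min(weights,
               key=lambda n: sum((w - v) ** 2 for w, v in zip(weights[n], x)))

# Two obvious groups of 2-D points stand in for real feature vectors.
cluster_a = [[0.1, 0.1], [0.15, 0.05], [0.05, 0.2]]
cluster_b = [[0.9, 0.9], [0.85, 0.95], [0.95, 0.8]]
som = train_som(cluster_a + cluster_b)
print(assign(som, [0.1, 0.1]), assign(som, [0.9, 0.9]))
```

After training, samples from distinct groups map to different grid nodes, which is the sense in which the SOM "groups similar data objects together".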

Resumo:

In this paper, artificial neural networks (ANNs) based on supervised and unsupervised algorithms were investigated for the study of rheological parameters of solid pharmaceutical excipients, with the aim of developing computational tools for manufacturing solid dosage forms. Among the four supervised networks investigated, the best learning performance was achieved by a feedforward multilayer perceptron whose architecture comprised eight neurons in the input layer, sixteen in the hidden layer and one in the output layer. Learning and predictive performance with respect to the angle of repose was poor, whereas the Carr index and Hausner ratio (CI and HR, respectively) showed very good fitting and learning capacity; HR and CI were therefore considered suitable descriptors for the next stage of development of the supervised ANNs. Clustering capacity was evaluated for five unsupervised strategies. Networks based on purely competitive strategies, the classic "Winner-Take-All", "Frequency-Sensitive Competitive Learning" and "Rival-Penalized Competitive Learning" (WTA, FSCL and RPCL, respectively), were able to cluster the database, but the classification was very poor, with severe errors such as grouping data with conflicting properties into the same cluster or even the same neuron; moreover, the criteria the networks adopted for the clustering could not be established. Self-Organizing Map (SOM) and Neural Gas (NG) networks showed better clustering capacity. Both recognized the two major groupings of the data, corresponding to lactose (LAC) and cellulose (CEL). However, the SOM made some errors in classifying data from the minority excipients: magnesium stearate (EMG), talc (TLC) and attapulgite (ATP). The NG network, in turn, performed a very consistent classification of the data and resolved the misclassifications of the SOM, making it the most appropriate network for classifying the data in this study.
The use of the NG network in pharmaceutical technology had not previously been reported. NG therefore has great potential for use in software for automated classification of pharmaceutical powders and as a new tool for mining and clustering data in drug development.
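The Neural Gas algorithm differs from SOM in that it has no fixed grid: for each input, all units are ranked by distance and updated in proportion to their rank. A minimal sketch on toy data follows; the unit count, decay schedules and the two stand-in "excipient" groups are assumptions for illustration, not the study's settings.

```python
import math
import random

def train_neural_gas(data, n_units=4, epochs=100, lr0=0.5, lam0=2.0, seed=1):
    """Neural Gas: each unit is updated according to its distance rank to the input."""
    rng = random.Random(seed)
    units = [[rng.random(), rng.random()] for _ in range(n_units)]
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (0.02) ** frac    # learning rate decays exponentially to 0.01
        lam = lam0 * (0.05) ** frac  # neighbourhood range shrinks to 0.1
        for x in data:
            # Rank every unit by squared distance to the input.
            order = sorted(range(n_units),
                           key=lambda i: sum((u - v) ** 2
                                             for u, v in zip(units[i], x)))
            for rank, i in enumerate(order):
                h = math.exp(-rank / lam)            # rank-based neighbourhood weight
                for k in range(len(x)):
                    units[i][k] += lr * h * (x[k] - units[i][k])
    return units

def nearest(units, x):
    """Index of the unit closest to sample x (its cluster label)."""
    return min(range(len(units)),
               key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))

# Toy feature vectors forming two groups, standing in for excipient descriptors.
group1 = [[0.2, 0.2], [0.25, 0.15], [0.15, 0.25]]
group2 = [[0.8, 0.8], [0.75, 0.85], [0.85, 0.75]]
units = train_neural_gas(group1 + group2)
print(nearest(units, [0.2, 0.2]), nearest(units, [0.8, 0.8]))
```

Because updates depend only on distance rank rather than grid position, Neural Gas avoids the topology constraints that can cause a SOM to misplace small, minority groups.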

Resumo:

Paper presented at the 2nd International Workshop on Pattern Recognition in Information Systems, Alicante, April 2002.

Resumo:

Recall of the personal experiences underlying a claim of food allergy or food intolerance is assessed by a psychologically validated tool, which looks for evidence that the suspected food could have caused the adverse symptom suffered. The tool distinguishes recall from memory of a particular episode (or episodes) in which eating the food was followed by symptoms, resulting in a self-diagnosis of food allergy or intolerance, from merely theoretical knowledge that such symptoms could arise after eating the food. If there is detailed recall of events pointing to the food as a potential cause of the symptom, and the symptom is sufficiently serious, the user is recommended to seek testing at an allergy clinic or by the appropriate specialist for a non-allergic sensitivity. If what is recalled does not support the logical possibility of a causal connection between eating the food and the occurrence of the symptom, the user is pointed to other potential sources of the problem and is also recommended to investigate remedies other than avoidance of the blamed food.
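The decision flow described above can be encoded roughly as follows. This is a hypothetical sketch, not the validated tool itself: the function name, the boolean inputs and the middle branch (detailed recall of a plausible link but a non-serious symptom, which the abstract does not specify) are all assumptions.

```python
def advise(detailed_recall: bool, causal_link_plausible: bool,
           symptom_serious: bool) -> str:
    """Hypothetical encoding of the tool's recommendation logic."""
    if detailed_recall and causal_link_plausible:
        if symptom_serious:
            # Recall supports the food as a potential cause and the symptom matters.
            return "seek testing at an allergy clinic or with the appropriate specialist"
        # Branch not specified in the abstract; assumed behaviour for illustration.
        return "monitor and re-assess if the symptom recurs or worsens"
    # Recall does not support a causal connection with the suspected food.
    return "investigate other potential causes and remedies besides avoiding the food"

print(advise(detailed_recall=True, causal_link_plausible=True, symptom_serious=True))
```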