834 results for Network Analysis Methods
Abstract:
This paper presents an automated image-based safety assessment method for earthmoving and surface mining activities. The literature review revealed the possible causes of accidents in earthmoving operations, investigated the spatial risk factors of these types of accidents, and identified the spatial data needed for automated safety assessment under current safety regulations. Image-based data collection devices and algorithms for safety assessment were then evaluated, and analysis methods and rules for monitoring safety violations were discussed. In the experiments, the safety assessment method collected spatial data using stereo vision cameras, applied object identification and tracking algorithms, and finally used the identified and tracked object information for safety decision making.
Abstract:
Films found on the windows of residential buildings have been studied. The main aim of the paper was to assess the role of these films in the accumulation of potentially toxic chemicals in residential buildings. The elemental and polycyclic aromatic hydrocarbon compositions of the surface films from the glass windows of eighteen residential buildings were therefore examined. The presence of substantial amounts of inorganic elements (4.0–1.2 × 10⁶ μg m⁻²) and polycyclic aromatic hydrocarbons (BDL–620.1 ng m⁻²) in the films has implications for human exposure and the fate of pollutants in the urban environment. To facilitate interpretation of the results, data matrices consisting of the chemical composition of the films and the building characteristics were subjected to multivariate data analysis methods, which revealed that the accumulation of the chemicals depended strongly on building characteristics such as the type of glass used for the window, the distance from a major road, the age of the building, the distance from industrial activity, the number of smokers in the building and the frequency of cooking. Building characteristics that minimize the accumulation of pollutants on surface films should therefore be encouraged.
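The abstract above does not specify which multivariate method was used; principal component analysis (PCA) is a common choice for relating chemical composition matrices to sample characteristics. A minimal NumPy sketch (the data matrix here is random and purely illustrative, not the paper's data):

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via eigendecomposition of the covariance matrix."""
    Xc = X - X.mean(axis=0)                # centre each variable
    cov = np.cov(Xc, rowvar=False)         # variables are columns
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]      # sort components by explained variance
    components = eigvecs[:, order[:n_components]]
    scores = Xc @ components               # project samples onto the components
    explained = eigvals[order[:n_components]] / eigvals.sum()
    return scores, components, explained

# Toy matrix: rows = window-film samples (18 buildings), columns = measured variables
rng = np.random.default_rng(0)
X = rng.normal(size=(18, 5))
scores, comps, ratio = pca(X, n_components=2)
```

Plotting the sample scores coloured by a building characteristic (e.g. distance from a major road) is a typical way such an analysis exposes the dependencies the abstract describes.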
Abstract:
In this paper, a comprehensive planning methodology is proposed that minimizes line loss, maximizes reliability and improves the voltage profile in a distribution network. The injected active and reactive power of Distributed Generators (DGs) and the installed capacitor sizes at different buses, for different load levels, are optimally controlled. The tap setting of the HV/MV transformer, along with line and transformer upgrades, is also included in the objective function. A hybrid optimization method, Hybrid Discrete Particle Swarm Optimization (HDPSO), is introduced to solve this nonlinear, discrete optimization problem. The proposed HDPSO is a developed version of DPSO in which the diversity of the optimization variables is increased using genetic algorithm operators to avoid trapping in local minima. The objective function comprises the investment cost of DGs, capacitors, distribution lines and the HV/MV transformer, the line loss, and the reliability, all expressed in dollar terms; a single-objective optimization method is therefore sufficient. The bus voltage and line current constraints are enforced throughout the optimization procedure. The IEEE 18-bus test system is modified and used to evaluate the proposed algorithm. The results demonstrate the need for optimal control of DG active and reactive power and of capacitors in distribution networks.
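HDPSO itself is not reproduced in the abstract; the sketch below shows only the standard continuous PSO velocity/position update that such hybrids build on, applied to a toy sphere objective. The inertia and acceleration constants, bounds and iteration counts are illustrative assumptions, not values from the paper:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, seed=0):
    """Minimal continuous particle swarm optimizer (the core that discrete/hybrid PSO extends)."""
    rng = np.random.default_rng(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration constants (illustrative)
    x = rng.uniform(-5, 5, (n_particles, dim))      # particle positions
    v = np.zeros_like(x)                            # particle velocities
    pbest = x.copy()                                # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()        # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

best, best_val = pso_minimize(lambda z: np.sum(z ** 2), dim=3)
```

In the paper's setting the decision variables (capacitor sizes, tap settings) are discrete, which is where the genetic-algorithm operators mentioned in the abstract come in to maintain diversity.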
Abstract:
Entrepreneurial Orientation (EO) has a 30-year history as one of the most used concepts in entrepreneurship research. “Recent attention in formal sessions at the Academy of Management conference programs confirm Entrepreneurial Orientation as a primary construct with a majority of Entrepreneurship Division sponsored sessions devoted to studies using EO related measures”, as reported by the 2010 division program chair, Per Davidsson (Roberts, 2010: 9). However, questions continue to be raised concerning over-dependence on parts of one strategic scale, possibly inappropriate or under-theorized adaptations, and the lack of theoretical development on application and performance variance in emergent, organizational, and socioeconomic settings. One recent area of investigation in analysis, methods, theory and application concerns an “EO gestalt”, which focuses on the family of EO-related measures and theory, rather than on one or more dimensions, in order to explore the theory and process of the Entrepreneurial Orientation phenomenon. The goals of the 4th Annual EO3 PDW are to enlighten researchers on the development of Entrepreneurial Orientation theory and related scales, to balance current Entrepreneurial Orientation knowledge with the new research frontiers suggested by EO3 scholars’ questions, and to transcend boundaries in the discoveries undertaken in the shared interdisciplinary and cross-cultural research agenda currently developing for Entrepreneurial Orientation concepts. Going into its fourth year, the EO3 PDW has been pivotal in formalizing discussion, pushing research forward, and gaining insights from experienced and cutting-edge scholars, as it provides a point of reference for coalescing research questions and findings surrounding this important concept.
Abstract:
Although germline mutations in CDKN2A are present in approximately 25% of large multicase melanoma families, such mutations are much rarer in the smaller melanoma families that account for most individuals reporting a family history of this disease. In addition, only three families worldwide have been reported with germline mutations in a gene other than CDKN2A (i.e., CDK4). Accordingly, genome-wide scans currently underway at the National Human Genome Research Institute aim to reveal linkage to one or more chromosomal regions and, ultimately, to identify novel genes involved in melanoma predisposition. Both CDKN2A and PTEN have been identified as genes involved in sporadic melanoma development; however, mutations are more common in cell lines than in uncultured tumors. A combination of cytogenetic, molecular, and functional studies suggests that additional genes involved in melanoma development are located in chromosomal regions 1p, 6q, 7p, 11q, and possibly also 9p and 10q. With the near completion of the human genome sequencing effort, combined with the advent of high-throughput mutation analyses and new techniques including cDNA and tissue microarrays, the identification and characterization of additional genes involved in melanoma pathogenesis seem likely in the near future.
Abstract:
p53 is the central member of a critical tumor suppressor pathway in virtually all tumor types, where it is silenced mainly by missense mutations. In melanoma, p53 predominantly remains wild type, and its role has therefore been neglected. To study the effect of p53 on melanocyte function and melanomagenesis, we crossed the 'high-p53' Mdm4+/- mouse to the well-established TP-ras0/+ murine melanoma progression model. After treatment with the carcinogen dimethylbenzanthracene (DMBA), TP-ras0/+ mice on the Mdm4+/- background developed fewer tumors, with a delayed age of melanoma onset, compared to TP-ras0/+ mice. Furthermore, we observed a dramatic decrease in tumor growth, a lack of metastasis, and increased survival of TP-ras0/+: Mdm4+/- mice. Thus, p53 effectively prevented the conversion of small benign tumors into malignant and metastatic melanoma. p53 activation in cultured primary melanocytes and melanoma cell lines using Nutlin-3, a specific Mdm2 antagonist, supported these findings. Moreover, global gene expression and network analysis of Nutlin-3-treated primary human melanocytes indicated that cell cycle regulation through the p21WAF1/CIP1 signaling network may be the key anti-melanomagenic activity of p53.
Abstract:
Recent studies have shown that small genetic regulatory networks (GRNs) can be evolved in silico displaying certain dynamics in the underlying mathematical model. It is expected that evolutionary approaches can help to gain a better understanding of biological design principles and assist in the engineering of genetic networks. To take the stochastic nature of GRNs into account, our evolutionary approach models GRNs as biochemical reaction networks based on simple enzyme kinetics and simulates them by using Gillespie’s stochastic simulation algorithm (SSA). We have already demonstrated the relevance of considering intrinsic stochasticity by evolving GRNs that show oscillatory dynamics in the SSA but not in the ODE regime. Here, we present and discuss first results in the evolution of GRNs performing as stochastic switches.
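Gillespie's stochastic simulation algorithm (SSA), which the abstract above relies on, is compact enough to sketch directly. The reaction network below is a toy birth-death gene expression model chosen for illustration, not one of the evolved GRNs from the paper:

```python
import numpy as np

def gillespie_ssa(x0, stoich, rate_fn, t_max, seed=0):
    """Gillespie's direct-method stochastic simulation algorithm."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_max:
        a = rate_fn(x)                       # propensity of each reaction
        a0 = a.sum()
        if a0 <= 0:
            break                            # no reaction can fire
        t += rng.exponential(1.0 / a0)       # exponential waiting time to next event
        j = rng.choice(len(a), p=a / a0)     # pick which reaction fires
        x += stoich[j]                       # apply its stoichiometric change
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)

# Toy birth-death model: 0 -> protein (k = 2.0), protein -> 0 (g = 0.1 per molecule)
stoich = np.array([[1], [-1]])
rates = lambda x: np.array([2.0, 0.1 * x[0]])
t, traj = gillespie_ssa([0], stoich, rates, t_max=50.0)
```

Running the same model as ODEs gives a smooth relaxation to the mean (k/g = 20 here); the SSA trajectory fluctuates around it, which is exactly the intrinsic stochasticity the evolutionary approach in the abstract is designed to exploit.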
Abstract:
Collaboration between academic and library faculty is an important topic of discussion and research among academic librarians. Partnerships are vital for developing effective information literacy education. The research reported in this paper aims to develop an understanding of academic collaborators by analyzing academic faculty’s teaching social networks. Academic faculty teaching social networks have not previously been described through the lens of social network analysis. A teaching social network comprises the people and communication channels that affect academic faculty when they design and deliver their courses. Social network analysis was the methodology used to describe these teaching social networks. The preliminary results show that academic faculty were more affected by their channels of communication in how they taught (pedagogy) than in what they taught (course content). This study supplements the existing research on collaboration and information literacy and provides both academic and library faculty with added insight into their relationships.
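The basic social network analysis measures used in studies like the one above (and the density measure mentioned in the collaboration abstract that follows) are straightforward to compute. A sketch with networkx on an entirely hypothetical teaching social network, where nodes are faculty and their contacts and edges are communication channels:

```python
import networkx as nx

# Hypothetical teaching social network: nodes are faculty and colleagues,
# edges are communication channels that influence course design and delivery.
G = nx.Graph()
G.add_edges_from([
    ("faculty_A", "librarian"), ("faculty_A", "faculty_B"),
    ("faculty_B", "librarian"), ("faculty_A", "colleague_C"),
])

density = nx.density(G)               # cohesion: fraction of possible ties present
centrality = nx.degree_centrality(G)  # who sits at the centre of communication
```

Here faculty_A is connected to every other node, so their degree centrality is 1.0; comparing such scores across pedagogy-related and content-related channels is one way to support the kind of finding the abstract reports.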
Abstract:
Collaboration has been enacted as a core strategy by both the government and nongovernment sectors to address many of the intractable issues confronting contemporary society. The cult of collaboration has become so pervasive that it is now an elastic term referring generally to any form of ‘working together’. This lack of specificity about collaboration and its practice means that it risks being reduced to mere rhetoric without sustained practice or action. Drawing on an extensive data set (qualitative and quantitative) of broadly collaborative endeavours gathered over ten years in Queensland, Australia, this paper aims to open up the black box of collaboration. Specifically, it examines the drivers for collaboration, the dominant structures and mechanisms adopted, what has worked, and the unintended consequences. In particular, it investigates the skills and competencies required in an embedded collaborative endeavour within and across organisations. Social network analysis is applied to isolate the structural properties that distinguish collaboration from other forms of integration, as well as to highlight key roles and tasks. Collaboration is found to be a distinctive form of working together, characterised by intense and interdependent relationships and exchanges, higher levels of cohesion (density), and requiring new ways of behaving, working, managing and leading. These elements are configured into a practice framework. Developing an empirical evidence base for collaboration structure, practice and strategy provides a useful foundation for theory extension. The paper concludes that for collaboration to be successfully employed as a management strategy, it must move beyond rhetoric and develop a coherent model for action.
Abstract:
Handling information overload online is, from the user's point of view, a major challenge, especially as the number of websites grows rapidly with the expansion of e-commerce and related activities. Personalization based on user needs is the key to solving the problem of information overload. Personalization methods help identify relevant information that a user may like. User profiles and object profiles are the important elements of a personalization system. When creating user and object profiles, most existing methods adopt two-dimensional similarity methods based on vector or matrix models to find inter-user and inter-object similarity. Moreover, for recommending similar objects to users, personalization systems use user-user, item-item and user-item similarity measures. In most cases, similarity measures such as Euclidean, Manhattan, cosine and many others based on vector or matrix methods are used. Web logs are high-dimensional datasets, consisting of multiple users and multiple searches with many attributes each. Two-dimensional data analysis methods may therefore overlook latent relationships between users and items. In contrast to other studies, this thesis utilises tensors, which are high-dimensional data models, to build user and object profiles and to find the inter-relationships between users and between users and items. To create an improved personalized Web system, this thesis proposes to build three types of profiles: individual user profiles, group user profiles and object profiles, utilising the decomposition factors of tensor data models. A hybrid recommendation approach utilising group profiles (forming the basis of a collaborative filtering method) and object profiles (forming the basis of a content-based method) in conjunction with individual user profiles (forming the basis of a model-based approach) is proposed for making effective recommendations.
A tensor-based clustering method is proposed that utilises the outcomes of popular tensor decomposition techniques such as PARAFAC, Tucker and HOSVD to group similar instances. An individual user profile, showing the user's highest interest, is represented by the top dimension values extracted from the component matrix obtained after tensor decomposition. A group profile, showing similar users and their highest interest, is built by clustering similar users based on the tensor-decomposed values; it is represented by the top association rules (containing various unique object combinations) derived from the searches made by the users in the cluster. An object profile represents similar objects clustered on the basis of the similarity of their features. Depending on the category of a user (known, anonymous or frequent visitor to the website), any of the profiles or their combinations is used for making personalized recommendations. A ranking algorithm is also proposed that utilises the personalized information to order and rank the recommendations. The proposed methodology is evaluated on data collected from a real-life car website. Empirical analysis confirms the effectiveness of the recommendations made by the proposed approach over those of collaborative filtering and content-based recommendation approaches based on two-dimensional data analysis methods.
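Of the decompositions named above, truncated HOSVD is the most compact to sketch: unfold the tensor along each mode, take the leading left singular vectors as that mode's factor matrix, then project the tensor onto all factors to get the core. A minimal NumPy illustration on a random toy users × items × features tensor (the data and ranks are assumptions for demonstration only):

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along the given mode (mode becomes the row axis)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated higher-order SVD: per-mode factor matrices plus a core tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])              # leading left singular vectors of the unfolding
    core = T
    for mode, U in enumerate(factors):        # project onto each factor: core = T x_n U^T
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Toy users x items x features tensor
rng = np.random.default_rng(0)
T = rng.normal(size=(6, 5, 4))
core, factors = hosvd(T, ranks=(3, 3, 2))
```

The rows of the mode-0 factor matrix then give each user's coordinates in the reduced space, which is the kind of component-matrix output the profile-building and clustering steps described above operate on.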
Abstract:
The improvement and optimization of business processes is one of the top priorities in an organization. Although process analysis methods are mature today, business analysts and stakeholders are still hampered by communication issues: analysts cannot effectively obtain accurate business requirements from stakeholders, and stakeholders are often confused by the analytic results offered by analysts. We argue that using a virtual world to model a business process can benefit communication activities. We believe that virtual worlds can serve as an efficient model-view approach, increasing the comprehension of business requirements and analytic results as well as the possibility of business plan validation. A healthcare case study is provided as an instance of the approach, illustrating how intuitive it can be. As an exploratory paper, we believe this promising research can encourage investigation of further research topics in the interdisciplinary area of information systems, visualization and multi-user virtual worlds.
Abstract:
Accurate and detailed road models play an important role in a number of geospatial applications, such as infrastructure planning, traffic monitoring, and driver assistance systems. In this thesis, an integrated approach for the automatic extraction of precise road features from high-resolution aerial images and LiDAR point clouds is presented. A framework for road information modeling has been proposed for rural and urban scenarios respectively, and an integrated system has been developed for road feature extraction using image and LiDAR analysis. For road extraction in rural regions, a hierarchical image analysis is first performed to maximize the exploitation of road characteristics at different resolutions. The rough locations and directions of roads are provided by the road centerlines detected in low-resolution images, both of which can be further employed to facilitate road information generation in high-resolution images. A histogram thresholding method is then used to classify road details in high-resolution images, with color space transformation used for data preparation. After road surface detection, anisotropic Gaussian and Gabor filters are employed to enhance road pavement markings while suppressing other ground objects, such as vegetation and houses. Pavement markings are then obtained from the filtered image using Otsu's thresholding method. The final road model is generated by superimposing the lane markings on the road surfaces, and the digital terrain model (DTM) produced from LiDAR data can be combined with it to obtain a 3D road model. As the extraction of roads in urban areas is greatly affected by buildings, shadows, vehicles, and parking lots, we combine high-resolution aerial images and dense LiDAR data to fully exploit the precise spectral and horizontal spatial resolution of aerial images and the accurate vertical information provided by airborne LiDAR.
Object-oriented image analysis methods are employed for feature classification and road detection in aerial images. In this process, we first utilize an adaptive mean shift (MS) segmentation algorithm to segment the original images into meaningful object-oriented clusters. The support vector machine (SVM) algorithm is then applied to the MS-segmented image to extract road objects. The road surface detected in LiDAR intensity images is used as a mask to remove the effects of shadows and trees. In addition, the normalized DSM (nDSM) obtained from LiDAR is employed to filter out other above-ground objects, such as buildings and vehicles. The proposed road extraction approaches are tested on rural and urban datasets respectively. The rural road extraction method is applied to pan-sharpened aerial images of the Bruce Highway, Gympie, Queensland. The road extraction algorithm for urban regions is tested on the Bundaberg datasets, which combine aerial imagery and LiDAR data. Quantitative evaluation of the extracted road information has been carried out for both datasets. The experiments and evaluation results on the Gympie datasets show that more than 96% of the road surfaces and over 90% of the lane markings are accurately reconstructed, with false alarm rates for road surfaces and lane markings below 3% and 2% respectively. For the urban test sites of Bundaberg, more than 93% of the road surface is correctly reconstructed, and the mis-detection rate is below 10%.
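Otsu's method, used above to separate pavement markings from the filtered background, picks the grey level that maximizes the between-class variance of the histogram. A self-contained NumPy sketch on a synthetic bimodal "image" (dark background, bright markings; the intensity distributions are assumptions for illustration):

```python
import numpy as np

def otsu_threshold(image):
    """Otsu's method: choose the grey level maximizing between-class variance."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    p = hist / hist.sum()                        # grey-level probabilities
    omega = np.cumsum(p)                         # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(256))           # cumulative mean
    mu_T = mu[-1]                                # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_T * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)           # guard against empty classes
    return int(np.argmax(sigma_b2))

# Synthetic bimodal intensities: dark road surface plus bright lane markings
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(50, 10, 5000),
                      rng.normal(200, 10, 1000)]).clip(0, 255)
t = otsu_threshold(img)
mask = img > t        # pixels classified as markings
```

Because the filtering step enhances markings against the surface, the histogram becomes strongly bimodal and the threshold lands in the gap between the two modes.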
Abstract:
IT-supported field data management benefits on-site construction management by improving access to information and promoting efficient communication between project team members. However, most on-site safety inspections still rely heavily on subjective judgment and manual reporting processes, so observers’ experience often determines the quality of risk identification and control. This study aims to develop a methodology for efficiently retrieving safety-related information so that safety inspectors can easily access the relevant site safety information for safer decision making. The proposed methodology consists of three stages: (1) development of a comprehensive safety database containing information on risk factors, accident types, the impact of accidents and safety regulations; (2) identification of relationships among different risk factors based on statistical analysis methods; and (3) user-specified information retrieval using data mining techniques for safety management. This paper presents the overall methodology and preliminary results of the first-stage research conducted with 101 accident investigation reports.
Abstract:
Background: Barmah Forest virus (BFV) disease is a common and widespread mosquito-borne disease in Australia. This study investigated the spatio-temporal patterns of BFV disease in Queensland, Australia using geographic information system (GIS) tools and geostatistical analysis. Methods/Principal Findings: We calculated the incidence rates and standardised incidence rates of BFV disease. Moran's I statistic was used to assess the spatial autocorrelation of BFV incidence. The spatial dynamics of BFV disease were examined using semi-variogram analysis. Interpolation techniques were applied to visualise and display the spatial distribution of BFV disease in statistical local areas (SLAs) throughout Queensland. Mapping of BFV disease by SLA reveals substantial spatio-temporal variation over time. Statistically significant differences in BFV incidence rates were identified among age groups (χ² = 7587, df = 7327, p < 0.01). There was a significant positive spatial autocorrelation of BFV incidence for all four periods, with Moran's I statistic ranging from 0.1506 to 0.2901 (p < 0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across the state. Conclusions/Significance: This is the first study to examine spatial and temporal variation in the incidence rates of BFV disease across Queensland using GIS and geostatistics. BFV transmission varied with age and gender, which may be due to exposure rates or behavioural risk factors. There are differences in the spatio-temporal patterns of BFV disease which may be related to local socio-ecological and environmental factors. These findings may have implications for BFV disease control and prevention programs in Queensland.
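The global Moran's I statistic reported above is simple to compute given a spatial weights matrix. A minimal NumPy sketch on a toy four-area example with rook-contiguity weights (the areas and values are illustrative assumptions, not the study's SLA data):

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I: spatial autocorrelation of x under weights matrix W."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()                        # deviations from the mean
    n = len(x)
    num = (W * np.outer(z, z)).sum()        # weighted cross-products of neighbouring deviations
    return (n / W.sum()) * num / (z ** 2).sum()

# Toy example: 4 areas in a row, rook-contiguity weights (1 = shares a border)
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x_clustered = [10, 9, 2, 1]                 # similar values cluster -> positive I
I = morans_i(x_clustered, W)
```

A positive I, as in the study's reported range of 0.1506 to 0.2901, indicates that areas with similar incidence rates tend to neighbour each other; significance is then assessed against the statistic's null distribution.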