787 results for Gradient-based approaches


Relevance: 80.00%

Abstract:

Android is becoming ubiquitous and currently has the largest share of the mobile OS market, with billions of application downloads from the official app market. It has also become the platform most targeted by mobile malware, which is becoming more sophisticated in order to evade state-of-the-art detection approaches. Many Android malware families employ obfuscation techniques to avoid detection, and this may defeat static-analysis-based approaches. Dynamic analysis, on the other hand, may be used to overcome this limitation. Hence, in this paper we propose DynaLog, a dynamic-analysis-based framework for characterizing Android applications. The framework provides the capability to analyse the behaviour of applications based on an extensive number of dynamic features. It provides an automated platform for mass analysis and characterization of apps that is useful for quickly identifying and isolating malicious applications. The DynaLog framework leverages existing open-source tools to extract and log high-level behaviours, API calls, and critical events that can be used to explore the characteristics of an application, thus providing an extensible dynamic-analysis platform for detecting Android malware. DynaLog is evaluated using real malware samples and clean applications, demonstrating its capabilities for effective analysis and detection of malicious applications.
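
As a rough sketch of the kind of feature extraction such a platform enables (the log format and feature names below are hypothetical, not DynaLog's actual output):

# Hypothetical sketch: turn a dynamic-analysis log of API calls and events
# into a binary feature vector usable for malware classification.
# The log format and the monitored feature list are illustrative only.

FEATURES = [
    "sendTextMessage",   # SMS API call
    "getDeviceId",       # device identifier access
    "DexClassLoader",    # dynamic code loading
    "BOOT_COMPLETED",    # boot event receiver
]

def extract_features(log_lines):
    """Return a dict mapping each monitored feature to 0/1 presence."""
    seen = set()
    for line in log_lines:
        for feat in FEATURES:
            if feat in line:
                seen.add(feat)
    return {feat: int(feat in seen) for feat in FEATURES}

if __name__ == "__main__":
    sample_log = [
        "API: android.telephony.SmsManager.sendTextMessage(...)",
        "EVENT: android.intent.action.BOOT_COMPLETED received",
    ]
    print(extract_features(sample_log))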

Relevance: 80.00%

Abstract:

The sudden change in the level of environmental munificence in the construction sector during the period 2007-2015 provides a natural experiment to investigate the strategic and operating actions of firms, particularly during an environmental jolt. Statistics on business failures corroborate that neither academics nor practitioners have succeeded in guiding strategic action during periods of environmental jolt. Despite the recent increase of turnaround research in the general management domain, its use in the construction management realm remains underexplored. To address this research gap, five exploratory case studies from an ongoing PhD study were used to examine the turnaround strategies of construction contractors during a period of economic contraction and growth. The findings show that, although retrenchment is often considered to be a short-term strategy, this is clearly not the case, with the majority of contractors maintaining the strategy for 6-7 years. During the same period, internationalization became critical, with the turnaround process shifting towards strategic reorientation that altered the firms' market domain. The case studies further suggest that strategic and operational actions resonate quite well with contemporary practice-based approaches to strategy making. The findings provide valuable assistance for construction contractors in dealing with organisational decline and in developing a successful turnaround response.

Relevance: 80.00%

Abstract:

This book chapter examines the conundrums and contradictions faced by the PSNI in delivering its community-policing agenda within a post-conflict environment that simultaneously demands counter-terrorism policing in view of the current dissident terrorist threat.

Relevance: 80.00%

Abstract:

This thesis develops bootstrap methods for factor models, which have been widely used to generate forecasts since the pioneering diffusion-index paper of Stock and Watson (2002). These models accommodate a large number of macroeconomic and financial variables as predictors, a useful feature for incorporating the diverse information available to economic agents. The thesis therefore proposes econometric tools that improve inference in factor models using latent factors extracted from a large panel of observed predictors. It is divided into three complementary chapters, the first two written in collaboration with Sílvia Gonçalves and Benoit Perron.

In the first chapter, we study how bootstrap methods can be used to conduct inference in models that forecast h periods into the future. To this end, it examines bootstrap inference in a factor-augmented regression setting where the errors may be autocorrelated. It generalizes the results of Gonçalves and Perron (2014) and proposes and justifies two residual-based approaches: the block wild bootstrap and the dependent wild bootstrap. Our simulations show improved coverage rates for the confidence intervals of the estimated coefficients using these approaches, compared with asymptotic theory and the wild bootstrap, in the presence of serial correlation in the regression errors.

The second chapter proposes bootstrap methods for constructing prediction intervals that relax the assumption of Gaussian innovations. We propose bootstrap prediction intervals for an observation h periods into the future and for its conditional mean. We assume these forecasts are made using a set of factors extracted from a large panel of variables. Because we treat these factors as latent, our forecasts depend on both the estimated factors and the estimated regression coefficients. Under regularity conditions, Bai and Ng (2006) proposed constructing asymptotic intervals under the assumption of Gaussian innovations. The bootstrap allows us to relax this assumption and to construct prediction intervals that are valid under more general assumptions. Moreover, even under Gaussianity, the bootstrap yields more accurate intervals when the cross-sectional dimension is relatively small, because it accounts for the bias of the ordinary least squares estimator, as shown in a recent study by Gonçalves and Perron (2014).

In the third chapter, we propose consistent selection procedures for factor-augmented regressions in finite samples. We first show that the usual cross-validation method is inconsistent, but that its generalization, leave-d-out cross-validation, selects the smallest set of estimated factors spanning the space generated by the true factors. The second criterion, whose validity we also establish, generalizes the bootstrap approximation of Shao (1996) to factor-augmented regressions. Simulations show an improved probability of parsimoniously selecting the estimated factors compared with the available selection methods.

The empirical application revisits the relationship between macroeconomic and financial factors and excess returns on the US stock market. Among the factors estimated from a large panel of US macroeconomic and financial data, the factors strongly correlated with interest-rate spreads and the Fama-French factors have good predictive power for excess returns.
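
For orientation, the factor-augmented forecasting setup the thesis builds on can be sketched as follows (standard diffusion-index notation; the thesis's exact specification may differ). The panel of N observed predictors is summarized by r latent factors,

\[ X_{it} = \lambda_i' F_t + e_{it}, \qquad i = 1, \dots, N, \]

and the h-step-ahead forecast regression uses estimated factors \(\hat{F}_t\) (e.g., principal components) in place of the latent \(F_t\):

\[ y_{t+h} = \alpha' \hat{F}_t + \beta' W_t + \varepsilon_{t+h}, \]

where \(W_t\) collects observed controls such as lags of \(y_t\). Bootstrap inference must account for two sources of error: the estimation of the factors and the least-squares estimation of \((\alpha, \beta)\).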

Relevance: 80.00%

Abstract:

This multi-perspectival Interpretive Phenomenological Analysis (IPA) study explored how people in the 'networks of concern' around four children with severe learning disabilities talked about trying to make sense of the children's challenging behaviours. The study also aimed to explore what affected relationships between people. It focused on the four children by interviewing their mothers, their teachers, and the CAMHS learning disability team members who were working with them; two fathers also joined part of the interviews. All interviews were conducted separately using a semi-structured approach. IPA allowed both a consideration of the participants' lived experiences and 'objects of concern' and a deconstruction of the multiple contexts of people's lives, with a particular focus on disability. The analysis rendered five themes: the importance of love and affection, the difficulties and the differences of living with a challenging child, the importance of being able to make sense of the challenges, and the value of good relationships between people. Findings were interpreted through the lens of the Coordinated Management of Meaning (CMM), which facilitated a systemic deconstruction and reconstruction of the findings. The research found that making sense of the challenges was a key concern for parents. Shared meanings, including diagnostic and behavioural narratives, were important for people's relationships with each other. The importance of context is also highlighted, including a consideration of how societal views of disability influence people in the 'network of concern' around the child. A range of systemic approaches, methods, and techniques is suggested as one way of improving services to these children and their families. It is suggested that adopting a 'both/and' position is important in such work: both applying evidence-based approaches and being alert to, and exploring, the different ways people try to make sense of the children's challenges. Implications for practice include helping professionals be alert to their own constructions and professional narratives, slowing the pace with families, staying close to the concerns of families, and addressing network issues.

Relevance: 80.00%

Abstract:

There is increasing advocacy for inclusive community-based approaches to environmental management, and growing evidence that involving communities improves the sustainability of social-ecological systems. Most community-based approaches rely on partnerships and knowledge exchange between communities, civil society organizations, and professionals such as practitioners and/or scientists. However, few models have actively integrated more horizontal knowledge exchange from community to community. We reflect on the transferability of community-owned solutions between indigenous communities by exploring the challenges and achievements of community peer-to-peer knowledge exchange as a way of empowering communities to face up to local environmental and social challenges. Using participatory visual methods, indigenous communities of the North Rupununi (Guyana) identified and documented their community-owned solutions through films and photostories. Indigenous researchers from this community then shared their solutions with six other communities that faced similar challenges within Guyana, Suriname, Venezuela, Colombia, French Guiana, and Brazil. They were supported by in-country civil society organizations and academics. We analyzed the impact of the knowledge exchange through interviews, field reports, and observations. Our results show that indigenous community members were significantly more receptive to solutions emerging from, and communicated by, other indigenous peoples, and that this approach was a significant motivating force for galvanizing communities to make changes in their community. We identified a range of enabling factors, such as building capacity for a shared conceptual and technical understanding, that strengthen the exchange between communities and contribute to a lasting impact. With national and international policy-makers mobilizing significant financial resources for biodiversity conservation and climate change mitigation, we argue that the promotion of community-owned solutions through community peer-to-peer exchange may deliver more long-lasting, socially and ecologically integrated, and investment-effective strategies compared to top-down, expert-led, and/or foreign-led initiatives.

Relevance: 80.00%

Abstract:

Hyperspectral sensors are being developed for remote sensing applications. These sensors produce huge data volumes, which require fast processing and analysis tools. Vertex component analysis (VCA) has become a very useful tool to unmix hyperspectral data. It has been successfully used to determine endmembers and unmix large hyperspectral data sets without any a priori knowledge of the constituent spectra. Compared with other geometry-based approaches, VCA is computationally efficient. In this paper we introduce new developments for VCA: 1) a new signal-subspace identification method (HySime) is applied to infer the signal subspace where the data set lives; this step also infers the number of endmembers present in the data set; 2) after projecting the data set onto the signal subspace, the algorithm iteratively projects the data onto directions orthogonal to the subspace spanned by the endmembers already determined, and the new endmember signature corresponds to the extreme of each projection. The capability of VCA to unmix large hyperspectral scenes (real or simulated) with low computational complexity is also illustrated.
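
A minimal NumPy sketch of the projection step described in 2) (illustrative only; it omits the signal-subspace identification, scaling, and initialization details of the full VCA algorithm):

import numpy as np

def vca_endmembers(Y, p, seed=0):
    """Toy sketch of VCA's core loop. Y is a (d, n) data matrix already
    projected onto a signal subspace; p is the number of endmembers.
    Each iteration projects the data onto a direction orthogonal to the
    subspace spanned by the endmembers found so far; the sample with the
    largest |projection| is taken as the next endmember."""
    rng = np.random.default_rng(seed)
    d, n = Y.shape
    E = np.zeros((d, p))      # endmember signatures (columns)
    idx = []
    for k in range(p):
        if k == 0:
            P = np.eye(d)
        else:
            # projector onto the orthogonal complement of span(E[:, :k])
            A = E[:, :k]
            P = np.eye(d) - A @ np.linalg.pinv(A)
        w = P @ rng.standard_normal(d)    # random direction in the complement
        proj = w @ Y                      # project all samples onto w
        j = int(np.argmax(np.abs(proj)))  # extreme of the projection
        E[:, k] = Y[:, j]
        idx.append(j)
    return E, idx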

Relevance: 80.00%

Abstract:

The present document deals with the shape optimization of aerodynamic profiles. The objective is to reduce the drag coefficient of a given profile without penalizing the lift coefficient. A set of control points defining the geometry is parameterized as a B-spline curve, and these points are modified automatically by means of CFD analysis. A given shape is defined by the user, and a valid volumetric CFD domain is constructed from this planar data and a set of user-defined parameters. The construction process involves 2D and 3D meshing algorithms that were coupled into in-house code. The volume of air surrounding the airfoil and the mesh quality are also parametrically defined. Standard NACA profiles were used to test the algorithm, first obtaining their control points. The Navier-Stokes equations were solved for turbulent, steady-state flow of compressible fluids using the k-epsilon model and the SIMPLE algorithm. To provide data for the optimization process, a utility was added to extract drag and lift data from the CFD simulation; after each simulation run, these data are passed to the optimizer. A gradient-based method using steepest descent was implemented to define the magnitude and direction of the displacement of each control point. The control points and the other parameters defined as design variables are modified iteratively to reach an optimum. Preliminary results on conceptual examples show a decrease in drag and a change in geometry consistent with aerodynamic principles.
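
As a rough illustration of the steepest-descent loop described above (a hypothetical sketch: in the actual work each drag evaluation is a full CFD run, abstracted here as a callable drag_fn; step sizes and names are illustrative):

import numpy as np

def optimize_profile(cp, drag_fn, step=1e-3, h=1e-4, iters=50):
    """Steepest-descent sketch for airfoil shape optimization.
    cp      : (n, 2) array of B-spline control points defining the profile
    drag_fn : callable returning the drag coefficient for given control
              points (a CFD run in the paper; any smooth surrogate works
              for illustration)
    The gradient is approximated by central finite differences, and each
    control point is displaced against the gradient."""
    cp = cp.astype(float).copy()
    for _ in range(iters):
        grad = np.zeros_like(cp)
        for i in range(cp.shape[0]):
            for j in range(2):
                cp[i, j] += h
                f_plus = drag_fn(cp)
                cp[i, j] -= 2 * h
                f_minus = drag_fn(cp)
                cp[i, j] += h              # restore the coordinate
                grad[i, j] = (f_plus - f_minus) / (2 * h)
        cp -= step * grad                  # move each point down the gradient
    return cp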

Relevance: 80.00%

Abstract:

Hyponatraemia, defined as a serum sodium concentration <135 mmol/l, is the most common disorder of body fluid and electrolyte balance encountered in clinical practice. It can lead to a wide spectrum of clinical symptoms, from subtle to severe or even life-threatening, and is associated with increased mortality, morbidity, and length of hospital stay in patients presenting with a range of conditions. Despite this, the management of patients remains problematic. The prevalence of hyponatraemia in widely different conditions, and the fact that hyponatraemia is managed by clinicians with a broad variety of backgrounds, have fostered diverse institution- and speciality-based approaches to diagnosis and treatment. To obtain a common and holistic view, the European Society of Intensive Care Medicine (ESICM), the European Society of Endocrinology (ESE) and the European Renal Association-European Dialysis and Transplant Association (ERA-EDTA), represented by European Renal Best Practice (ERBP), have developed the Clinical Practice Guideline on the diagnostic approach and treatment of hyponatraemia as a joint venture of three societies representing specialists with a natural interest in hyponatraemia. In addition to a rigorous approach to methodology and evaluation, we were keen to ensure that the document focused on patient-important outcomes and included utility for clinicians involved in everyday practice.

Relevance: 80.00%

Abstract:

In recent years, special attention has been devoted to food-induced allergies, among which hazelnut allergy is highlighted. Hazelnut is one of the most commonly consumed tree nuts, widely used by the food industry in a variety of processed foods. It has been regarded as a food with potential health benefits, but also as a source of allergens capable of inducing mild to severe allergic reactions in sensitised individuals. Considering the great and growing number of reports addressing hazelnut allergens, this review intends to assemble all the relevant information available so far on the main issues: the prevalence of tree nut allergy, clinical threshold levels, the molecular characterisation of hazelnut allergens (Cor a 1, Cor a 2, Cor a 8, Cor a 9, Cor a 10, Cor a 11, Cor a 12, Cor a 14 and Cor a TLP) and their clinical relevance, and methodologies for hazelnut allergen detection in foods. A comprehensive overview of the current data on the molecular characterisation of hazelnut allergens is presented, relating biochemical classification and biological function to clinical importance. Recent advances in hazelnut allergen detection methodologies are summarised and compared, including all the novel protein- and DNA-based approaches.

Relevance: 80.00%

Abstract:

The goal of image retrieval and matching is to find and locate object instances in images from a large-scale image database. While visual features are abundant, how to combine them to improve performance over individual features remains a challenging task. In this work, we focus on leveraging multiple features for accurate and efficient image retrieval and matching.

We first propose two graph-based approaches to rerank initially retrieved images for generic image retrieval. In the graph, vertices are images and edges are similarities between image pairs. Our first approach employs a mixture Markov model, based on a random walk over multiple graphs, to fuse the graphs. We introduce a probabilistic model to compute the importance of each feature for graph fusion under a naive Bayesian formulation, which requires similarity statistics from a manually labeled dataset containing irrelevant images. To reduce human labeling, we further propose a fully unsupervised reranking algorithm based on a submodular objective function that can be efficiently optimized by a greedy algorithm. By maximizing an information-gain term over the graph, our submodular function favors a subset of database images that are similar to the query images and resemble each other. The function also exploits the rank relationships of images across multiple ranked lists obtained from different features.

We then study a more well-defined application, person re-identification, where the database contains labeled images of human bodies captured by multiple cameras. Re-identifications from multiple cameras are regarded as related tasks to exploit shared information. We apply a novel multi-task learning algorithm using both low-level features and attributes. A low-rank attribute embedding is jointly learned within the multi-task learning formulation to map the original binary attributes into a continuous attribute space, where incorrect and incomplete attributes are rectified and recovered.

To locate objects in images, we design an object detector based on object proposals and deep convolutional neural networks (CNNs), in view of the emergence of deep networks. We improve the Fast R-CNN framework and investigate two new strategies to detect objects accurately and efficiently: scale-dependent pooling (SDP) and cascaded rejection classifiers (CRC). SDP improves detection accuracy by exploiting convolutional features appropriate to the scale of the input object proposals. CRC effectively utilizes convolutional features and eliminates negative proposals in a cascaded manner, while maintaining high recall for true objects. Together, the two strategies improve detection accuracy and reduce computational cost.
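
A compact sketch of the greedy optimization used for the unsupervised reranking (illustrative: a facility-location objective stands in for the paper's information-gain objective, which is not reproduced here):

import numpy as np

def greedy_rerank(S, k):
    """Greedy selection sketch for submodular reranking.
    S : (n, n) nonnegative similarity matrix over database images
    k : number of images to select
    Uses a facility-location objective F(A) = sum_i max_{j in A} S[i, j]
    as a stand-in for the paper's objective. Greedy selection gives a
    (1 - 1/e) approximation for monotone submodular functions."""
    n = S.shape[0]
    selected, covered = [], np.zeros(n)
    for _ in range(k):
        # marginal gain of adding each candidate j to the selected set
        gains = np.maximum(S, covered[:, None]).sum(axis=0) - covered.sum()
        gains[selected] = -np.inf          # never reselect an image
        j = int(np.argmax(gains))
        selected.append(j)
        covered = np.maximum(covered, S[:, j])
    return selected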

Relevance: 80.00%

Abstract:

Uncovering unknown pathological mechanisms and the body's response to applied medication are the driving forces toward personalized medicine. In this post-genomic era, all eyes are turned to the proteomics field, searching for answers and explanations by investigating the final physiological functional units: proteins and their proteoforms. The development of cutting-edge mass spectrometric technologies and powerful bioinformatics tools has allowed the life-science community to mine disease-specific proteins as biomarkers, which are often hidden by the high complexity of the samples and/or their low abundance. Nowadays, there are several proteomics-based approaches to study the proteome. This chapter focuses on gold-standard proteomics strategies, and related issues, for candidate biomarker discovery, which may have diagnostic/prognostic as well as mechanistic utility.

Relevance: 80.00%

Abstract:

Rhizobium freirei PRF 81 is employed in commercial common bean inoculants in Brazil owing to its outstanding nitrogen-fixation efficiency, competitiveness, and tolerance to abiotic stresses. Among the environmental conditions faced by rhizobia in soils, acidity is perhaps the most frequently encountered, especially in Brazil. We therefore used proteomics-based approaches to study the responses of PRF 81 to a low-pH condition. R. freirei PRF 81 was grown in TY medium until exponential phase under two treatments: pH 6.8 and pH 4.8. Whole-cell proteins were extracted and separated by two-dimensional gel electrophoresis, using IPG strips with pH range 4-7 and 12% polyacrylamide gels. The experiment was performed in triplicate. Protein spots were detected in high-resolution digitized gel images and analyzed with Image Master 2D Platinum v5.0 software. Relative volumes (%vol) were compared between the two conditions tested and statistically evaluated (p ≤ 0.05). Although R. freirei PRF 81 can still grow under more acidic conditions, pH 4.8 was chosen because it did not significantly affect the bacterial growth kinetics, a factor that could otherwise compromise the analysis. Using a narrow pH range, the gel profiles displayed better resolution and reproducibility than with a broader pH range. Spots were mostly concentrated between pH 5-7, with molecular masses between 17-95 kDa. Of the six hundred well-defined spots analyzed, one hundred and sixty-three showed a significant change in %vol, indicating that pH led to expressive changes in the proteome of R. freirei PRF 81. Of these, sixty-one were up-regulated and one hundred and two were down-regulated at pH 4.8. In addition, fourteen spots were identified only in the acidic condition, while seven spots were detected exclusively at pH 6.8. Ninety-five differentially expressed spots and two detected exclusively at pH 4.8 were selected for MALDI-TOF identification. Together with the genome sequencing and the proteome analysis of heat stress, we will search for molecular determinants of PRF 81 related to its capacity to adapt to stressful tropical conditions.
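
As a minimal illustration of the spot-comparison step (a hypothetical sketch; in the study this analysis was performed with the Image Master software on the triplicate %vol values, and the data layout below is assumed):

from scipy import stats

def differential_spots(vol_ph68, vol_ph48, alpha=0.05):
    """Flag spots whose relative volume (%vol) differs between conditions.
    vol_ph68, vol_ph48 : dicts mapping spot id -> list of %vol values from
    the triplicate gels of each pH condition (hypothetical data layout).
    Returns (up, down): spots up- or down-regulated at pH 4.8 (p <= alpha)."""
    up, down = [], []
    for spot in vol_ph68:
        a, b = vol_ph68[spot], vol_ph48[spot]
        _, p = stats.ttest_ind(a, b)   # two-sample t-test per spot
        if p <= alpha:
            (up if sum(b) / len(b) > sum(a) / len(a) else down).append(spot)
    return up, down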

Relevance: 80.00%

Abstract:

The development of oil-well drilling requires additional care, mainly when drilling offshore in ultra-deep water, where low overburden pressure gradients cause low fracture gradients and, consequently, hinder drilling by narrowing the operational window. To minimize, in the well-planning phase, the difficulties faced when drilling in these scenarios, indirect models are used to estimate the fracture gradient, predicting approximate values for leakoff tests. These models generate geopressure curves that allow detailed analysis of the pressure behavior along the whole well. Most of these models are based on the Terzaghi equation, differing only in how the value of the rock stress coefficient is determined. This work proposes an alternative method for predicting the fracture pressure gradient based on a geometric correlation that relates the pressure gradients proportionally at a given depth and extrapolates this relation over the whole well depth, meaning that these parameters vary in a fixed proportion. The model is based on the application of analytical proportion segments corresponding to the differential pressure related to the rock stress. The study shows that the proposed analytical proportion segments reach fracture-gradient values in good agreement with the leakoff tests available for the field area. The results were compared with twelve different indirect models for fracture-pressure-gradient prediction based on the compaction effect. For this, software was developed in MATLAB. The comparison was also made varying the water depth from zero (onshore wellbores) to 1500 meters, with the leakoff tests used to benchmark the different methods, including the one proposed in this work. The proposed method gives good results in the error analysis compared with the other methods and, owing to its simplicity, justifies its possible application.
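
For context, most of the compaction-based indirect models mentioned above share a Terzaghi-type form (a standard result in the well-engineering literature, stated here for orientation; each model differs in how the coefficient K is obtained):

\[ G_f = K \,(G_{ov} - G_p) + G_p, \]

where \(G_f\), \(G_{ov}\) and \(G_p\) are the fracture, overburden and pore-pressure gradients at a given depth, and K is the rock (matrix) stress coefficient. In Eaton's method, for instance, K is derived from Poisson's ratio \(\nu\) as \(K = \nu/(1-\nu)\).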