963 results for Link variable method
Abstract:
With the increasing popularity of wireless networks and their applications, mobile ad-hoc networks (MANETs) have emerged recently. MANET topology is highly dynamic and nodes are highly mobile, so the rate of link failure in a MANET is high. There is no central control over the nodes; control is distributed among them, and each node can act as either a router or a source. MANETs have been considered isolated, stand-alone networks. Nodes can join or leave at any time, and the network is not infrastructure dependent, so it can be set up anywhere at any time and trouble-free communication is possible. Because link failures, collisions and transmission errors are more likely in a MANET, network maintenance becomes costly. Studies show that frequent link failures are an important factor in diminishing network performance, and they are not predictable. The main objective of this paper is to study route instability in the AODV protocol and suggest a solution for improvement. This paper proposes a new approach to reduce route failures by storing an alternate route in the intermediate nodes. In this algorithm, intermediate nodes are also involved in the route discovery process. This reduces the route establishment overhead as well as the time to find a new route when a link failure occurs.
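A minimal sketch of the alternate-route idea described above, assuming hypothetical data structures (the Node class, route tables and next_hop helper are illustrative, not the paper's implementation): an intermediate node caches an alternate next hop learned during route discovery and falls back to it when the primary link fails, resorting to a fresh AODV route discovery only when both fail.

```python
# Hypothetical sketch: alternate-route caching in an intermediate node.
class Node:
    def __init__(self, name):
        self.name = name
        self.primary_route = {}    # destination -> primary next hop
        self.alternate_route = {}  # destination -> cached alternate next hop

    def learn_routes(self, destination, primary, alternate=None):
        # Routes learned during the route discovery phase.
        self.primary_route[destination] = primary
        if alternate is not None:
            self.alternate_route[destination] = alternate

    def next_hop(self, destination, failed_links):
        hop = self.primary_route.get(destination)
        if hop is not None and (self.name, hop) not in failed_links:
            return hop
        # Primary link failed: try the cached alternate before re-discovering.
        hop = self.alternate_route.get(destination)
        if hop is not None and (self.name, hop) not in failed_links:
            self.primary_route[destination] = hop  # promote the alternate
            return hop
        return None  # fall back to a full route-discovery (RREQ) cycle


node_b = Node("B")
node_b.learn_routes("D", primary="C", alternate="E")
print(node_b.next_hop("D", failed_links={("B", "C")}))  # -> "E"
```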
Abstract:
In this paper a method of copy detection in short Malayalam text passages is proposed. Given two passages, one as the source text and the other as the suspected copy, it is determined whether the second passage is a plagiarized version of the source text. An algorithm for plagiarism detection using the n-gram model for word retrieval is developed, and trigrams are found to be the best model for comparing Malayalam text. Based on the probability and the resemblance measures calculated from the n-gram comparison, the text is categorized against a threshold. Texts are compared by variable-length n-gram (n = {2, 3, 4}) comparisons. The experiments show that the trigram model gives acceptable average performance with affordable cost in terms of complexity.
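As a rough illustration of the n-gram comparison (a sketch only: the THRESHOLD value and the English example sentences are illustrative, not taken from the paper, which works on Malayalam text), word trigram sets of the two passages can be scored with a Jaccard-style resemblance measure:

```python
# Sketch of trigram-based resemblance scoring between two passages.
def ngrams(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def resemblance(source, suspect, n=3):
    a, b = ngrams(source, n), ngrams(suspect, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)  # Jaccard-style overlap of trigram sets

THRESHOLD = 0.3  # illustrative value, not from the paper
score = resemblance("the quick brown fox jumps over the lazy dog",
                    "a quick brown fox jumps over a lazy dog")
print(score, score >= THRESHOLD)
```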
Abstract:
The study of variable stars is an important topic of modern astrophysics. Since the invention of powerful telescopes and high resolving power CCDs, variable star data have been accumulating on the order of petabytes. This huge amount of data requires many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary topic of Astrostatistics. For an observer on earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various reasons. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, like eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospheric stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. The sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star. One way to identify the type of a variable star and to classify it is visual inspection of the phased light curve by an expert. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages: observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g., the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of the basic parameters such as period, amplitude and phase, along with some other derived parameters. Of these, period is the most important, since wrong periods can lead to sparse light curves and misleading information. Time series analysis is a method of applying mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. This is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic ray particles.
Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation. The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as a Gaussian). Many of the parametric methods are based on variations of discrete Fourier transforms, like the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and the Significant Spectrum (SigSpec) method by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of the methods stated above fully recovers the true periods. Wrong detection of a period can be due to several reasons, such as power leakage to other frequencies, which is caused by the finite total interval, the finite sampling interval and the finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state that "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification". It will be beneficial for the variable star astronomical community if basic parameters such as period, amplitude and phase are obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry in the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
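To make the folding and period-search steps concrete, here is a small sketch (not the thesis code; the toy light curve, bin count and period grid are arbitrary choices) of folding an unevenly sampled time series on a trial period and scoring trial periods with a simple phase-dispersion-minimisation statistic in the spirit of Stellingwerf (1978):

```python
import numpy as np

def fold(t, period):
    # Phase of each observation for a given trial period.
    return (t / period) % 1.0

def pdm_statistic(t, mag, period, nbins=10):
    # Ratio of pooled within-bin variance to overall variance; small when
    # folding on the true period produces a tight phased light curve.
    phase = fold(t, period)
    bins = np.floor(phase * nbins).astype(int)
    overall_var = np.var(mag, ddof=1)
    num, den = 0.0, 0
    for b in range(nbins):
        m = mag[bins == b]
        if m.size > 1:
            num += (m.size - 1) * np.var(m, ddof=1)
            den += m.size - 1
    return (num / den) / overall_var if den else np.inf

# Toy unevenly sampled sinusoidal "light curve" with true period 0.75 d.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 30, 300))
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / 0.75) + rng.normal(0, 0.02, t.size)

trial_periods = np.linspace(0.5, 1.0, 2001)
scores = [pdm_statistic(t, mag, p) for p in trial_periods]
print(trial_periods[int(np.argmin(scores))])  # ≈ 0.75
```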
Abstract:
Consumers are becoming more concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). Therefore, during recent years there has been growing interest in methods for food quality assessment, especially in picture-development methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). Biocrystallization, one of the picture-developing methods, is based on the crystallographic phenomenon that when aqueous solutions of CuCl2 dihydrate are crystallized with the addition of organic solutions, originating, e.g., from crop samples, biocrystallograms with reproducible crystal patterns are generated (Kleber & Steinike-Hartung, 1959). Its output is a crystal pattern on glass plates, from which different variables (numbers) can be calculated using image analysis. However, there is a lack of a standardized evaluation method to quantify the morphological features of the biocrystallogram image. Therefore, the main aims of this research are (1) to optimize an existing statistical model in order to describe all the effects that contribute to the experiment; (2) to investigate the effect of image parameters on the texture analysis of the biocrystallogram images, i.e., region of interest (ROI), color transformation and histogram matching, on samples from the project 020E170/F financed by the Federal Ministry of Food, Agriculture and Consumer Protection (BMELV); the samples are wheat and carrots from controlled field and farm trials; and (3) to relate the strongest texture parameter effect to the visual evaluation criteria that have been developed by a group of researchers (University of Kassel, Germany; Louis Bolk Institute (LBI), Netherlands; and Biodynamic Research Association Denmark (BRAD), Denmark) in order to clarify the relation between the texture parameters and the visual characteristics of an image. The refined statistical model was accomplished using an lme model with repeated measurements via crossed effects, programmed in R (version 2.1.0). The validity of the F and P values was checked against the SAS program. While the ANOVA gives the same F values, the P values are larger in R because of its more conservative approach; the refined model yields more significant P values. The optimization of the image analysis deals with the following parameters: ROI (region of interest, the area around the geometrical center), color transformation (calculation of the one-dimensional gray-level value from the three-dimensional color information of the scanned picture, which is necessary for the texture analysis) and histogram matching (normalization of the histogram of the picture to enhance the contrast and to minimize errors from lighting conditions). The samples were wheat from the DOC trial with 4 field replicates for the years 2003 and 2005, "market samples" (organic and conventional neighbors with the same variety) for 2004 and 2005, carrots obtained from the University of Kassel (2 varieties, 2 nitrogen treatments) for the years 2004, 2005 and 2006, and "market samples" of carrots for the years 2004 and 2005. The criterion for the optimization was the repeatability of the differentiation of the samples over the different harvests (years). Different ROIs were found for different samples, reflecting the different pictures.
The best color transformation, showing efficient differentiation, relies on the gray scale, i.e., the equal color transformation. The second dimension of the color transformation only appeared in some years as an effect of color wavelength (hue) for carrots treated with different nitrate fertilizer levels. The best histogram matching is to the Gaussian distribution. The approach was to find a connection between the variables from textural image analysis and the different visual criteria. The relation between the texture parameters and the visual evaluation criteria was limited to the carrot samples, especially as these could be well differentiated by the texture analysis. It was possible to connect groups of variables of the texture analysis with groups of criteria from the visual evaluation. These selected variables were able to differentiate the samples but not to classify the samples according to the treatment. In contrast, in the case of the visual criteria, which describe the picture as a whole, classification was possible in 80% of the sample cases. This clearly shows the limits of the single-variable approach of the image analysis (texture analysis).
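For orientation, a compact sketch of the three image-processing steps discussed above (ROI around the geometric centre, colour-to-grey transformation, histogram matching to a Gaussian); the synthetic image and the target mean and standard deviation are illustrative assumptions, not the project's actual parameters:

```python
import numpy as np
from scipy.stats import norm

def roi_around_center(img, size):
    # Square region of interest centred on the geometrical centre.
    h, w = img.shape[:2]
    r0, c0 = (h - size) // 2, (w - size) // 2
    return img[r0:r0 + size, c0:c0 + size]

def to_gray(rgb):
    # "Equal colour transformation": simple mean of the three channels.
    return rgb.mean(axis=2)

def match_to_gaussian(gray, mean=128.0, std=30.0):
    # Rank-based histogram matching to a Gaussian target distribution.
    flat = gray.ravel()
    ranks = flat.argsort().argsort()
    quantiles = (ranks + 0.5) / flat.size
    matched = norm.ppf(quantiles, loc=mean, scale=std)
    return matched.reshape(gray.shape)

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(400, 400, 3)).astype(float)  # synthetic scan
gray = to_gray(roi_around_center(image, 256))
print(match_to_gaussian(gray).std())  # ≈ 30
```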
Abstract:
This research is a study of knowledge interfaces that aims to analyse knowledge discontinuities and the dynamic and emergent character of struggles and interactions within the gender system and ethnic differences. The cacao boom phenomenon in Central Sulawesi is the main context for a change in the social relations of production, especially where the mode of production has shifted, or is still shifting, from subsistence to petty commodity production. This agrarian change is not only about a change of relationships and practices but, as my previous research has shown, also about a shift in knowledge domination, because knowledge construes social practice in a dialectical process. Agroecological knowledge is accumulated through interaction, practice and experience. At the same time, the knowledge gained from new practices and experiences changes the mode of interaction, so such processes provide the arena where an interface of knowledge is manifested. In the process of agro-ecological knowledge interface, gender and ethnic group interactions materialise in decision-making on production and resource allocation at the household and community level. At this point, power/knowledge comes into play to gain authority in decision-making. When authority dominates, power encounters resistance, whereas both the dominant power and its resistance are aimed at ensuring socio-economic security. Eventually, the process of struggle can be identified through the pattern of resource utilisation as a realisation of production decision-making. Such processes vary from one community to another and therefore show uniqueness as well as commonalities, especially when placed in the context of a shifting mode of production. The focus is placed on actors, men and women, in their institutional and cultural setting, including the role of development agents. The inquiry is informed by four major questions: 1) How do women and men acquire, disseminate, and utilise their agro-ecological knowledge, specifically in rice farming as a subsistence commodity as well as in cacao farming as a petty commodity? How and why do such mechanisms construct different knowledge domains between the two genders? How does the knowledge mechanism apply in different ethnic groups? What are the implications for gender- and ethnicity-based relations of production? 2) Using the concept of valued knowledge in the context of a shifting mode of production: is there any knowledge that dominates others? How does the process of domination occur and why? Is there any form of struggle, strategy, negotiation or compromise over this domination? How do these processes take place at the household as well as the community level? How do they relate to production decision-making? 3) Putting the previous questions in two communities at different points of arrival on the path of agricultural commercialisation, how do the processes of struggle vary? What are the bases of the commonalities and peculiarities in both communities? 4) How do the decisions on production affect rice field - cacao plantation - forest utilisation in the two villages? How does that triangle of resource use reflect the constellation of local knowledge in those two communities? What is the implication of this knowledge constellation for the cacao-rice-forest agroecosystem in the forest margin area? Employing a qualitative approach as the main method of inquiry, in-depth and dialogic interviews, a participant observer role, and document review are used to gather information.
A small survey and a children's writing competition supplement this data collection method. The latter two methods are aimed at giving wider information on household decision-making and perceptions of the forest. It was found that local knowledge, particularly knowledge pertaining to rice-forest-cacao agroecology, is divided according to gender and ethnicity. This constellation places the process of decision-making as 'the arena of interface' between feminine and masculine knowledge, as well as between dominant and less dominant ethnic groups. The transition from a subsistence to a commercial mode of production is the context that frames a process in which knowledge about the cacao commodity is valued more highly than knowledge about rice. The market mechanism, as an external power, defines valued knowledge. Valued knowledge defines the dominant knowledge holder and the decision. Therefore, cacao cultivation becomes a dominant practice; its existence sacrifices the presence of the rice field and the forest. Knowledge about rice production and the forest ecosystem exists but is less valued, so it is unable to challenge the domination of cacao. Various forms of struggle, within the gender and ethnicity context, to resist cacao domination are an expression of unequal knowledge possession. Knowledge inequality implies unequal access to the benefits of a market-valued crop. When unequal knowledge fails to construct a negotiated field, or struggles fail to produce a 'marginal' decision, e.g. intensification instead of cacao expansion into the forest, the interface only produces divergence. Gender- and ethnicity-divided knowledge remains unbridged, since negotiation is unable to produce new knowledge that accommodates both interests. Rice is loaded with the ecological interest of conserving the forest, while cacao is driven by the economic interest of increasing welfare status. The implication of this unmediated dominant knowledge of cacao production is the construction of access: access to the forest, mainly to withdraw its economic benefit by eliminating its ecological benefit; then access to cacao as the social relationship of production to acquire cacao knowledge; and lastly access to defend a sustainable benefit from cacao by expansion. 'Socio-economic security' is defined by access. The convergence of rice and cacao knowledge, however, should be made possible across gender and ethnicity, not only for the sake of forest conservation as the insurance of ecological security, but also for the community's socio-economic security. The convergence might be found in a range of alternative ways to conduct sustainable cacao production, from agroforestry systems to intensification.
Abstract:
It is well known that regression analyses involving compositional data need special attention because the data are not of full rank. For a regression analysis where both the dependent and the independent variable are components, we propose a transformation of the components emphasizing their role as dependent and independent variables. A simple linear regression can be performed on the transformed components. The regression line can be depicted in a ternary diagram, facilitating the interpretation of the analysis in terms of components. An example with time-budgets illustrates the method and the graphical features.
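Because the paper's specific transformation is not reproduced here, the following sketch uses a generic additive log-ratio transform as a stand-in to show the overall idea: transform both compositions, then run a simple linear regression on the transformed variables (the simulated time-budget-style data and coefficients are purely illustrative):

```python
import numpy as np

def alr(parts):
    # Additive log-ratio of a two-part composition: log(x1 / x2).
    parts = np.asarray(parts, dtype=float)
    return np.log(parts[:, 0] / parts[:, 1])

def alr_inv(z):
    # Map a log-ratio back to a two-part composition summing to 1.
    e = np.exp(z)
    return np.column_stack([e / (1 + e), 1 / (1 + e)])

rng = np.random.default_rng(0)
x = rng.dirichlet([4, 2], size=50)                          # independent composition
y = alr_inv(0.8 * alr(x) + 0.3 + rng.normal(0, 0.1, 50))    # dependent composition

slope, intercept = np.polyfit(alr(x), alr(y), 1)            # regression on transforms
print(slope, intercept)  # ≈ 0.8, 0.3
```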
Abstract:
In CoDaWork'05, we presented an application of discriminant function analysis (DFA) to 4 different compositional datasets and modelled the first canonical variable using a segmented regression model based solely on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration, as discussed previously, another method is proposed for mapping the unknown tephras onto samples of the reference set or onto missing samples between consecutive reference samples. The application of these methodologies is demonstrated with both simulated and real datasets. This newly proposed methodology provides an alternative, more acceptable approach for geologists, as their focus is on mapping the unknown tephra to relevant eruptive events rather than estimating the age of the unknown tephra. Key words: Tephrochronology; Segmented regression
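A small sketch of the segmented (two-piece) regression idea used for the first canonical variable; the synthetic data and breakpoint grid are illustrative, not the tephra measurements:

```python
import numpy as np

def fit_segmented(x, y, candidates):
    # For each candidate breakpoint, fit straight lines on either side and
    # keep the breakpoint with the smallest total residual sum of squares.
    best = None
    for bp in candidates:
        left, right = x <= bp, x > bp
        if left.sum() < 2 or right.sum() < 2:
            continue
        rss = 0.0
        for mask in (left, right):
            coef = np.polyfit(x[mask], y[mask], 1)
            rss += np.sum((y[mask] - np.polyval(coef, x[mask])) ** 2)
        if best is None or rss < best[1]:
            best = (bp, rss)
    return best  # (breakpoint, residual sum of squares)

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
y = np.where(x < 4, 2 * x, 8 + 0.5 * (x - 4)) + rng.normal(0, 0.3, x.size)
print(fit_segmented(x, y, candidates=x[5:-5]))  # breakpoint ≈ 4
```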
Abstract:
In the B-ISDN there is a provision for four classes of services, all of them supported by a single transport network (the ATM network). Three of these services, the connection-oriented (CO) ones, permit connection access control (CAC), but the fourth, the connectionless-oriented (CLO) one, does not. Therefore, when the CLO service and CO services have to share the same ATM link, a conflict may arise: a bandwidth allocation chosen to obtain maximum statistical gain can damage the contracted ATM quality of service (QoS), and, vice versa, in order to guarantee the contracted QoS the statistical gain has to be sacrificed. The paper presents a performance evaluation study of the influence of the CLO service on a CO service (a circuit emulation service or a variable bit-rate service) when they share the same link.
Abstract:
Experimental and comparative methods in the social sciences
Abstract:
The strategies of a firm in a small market, where there are few buyers and many sellers, become the centre and key point of success; otherwise the firm merely follows a market trend that in the medium term will not be effective, and it ends up disappearing. That is why market strategies become the differentiating characteristic of companies, allowing them to take advantage of the overcrowding of a sector. For a company to be successful it must take into account its competitive advantages and exploit them to the fullest according to the conditions of the market in which it operates, whether these advantages lie in structure, resources or human talent. This case study aims to address the dynamics of a specific market with particular trends that define the way of competing and the customs of its main actors. These same trends shape the business models in the sector, which, through analyses such as PESTEL or the viewpoints of authors such as Kotler (1992) and his competitive strategies according to market share, Miller (1986) and his definitions of markets, or Porter (1980) with his market forces, makes it possible to provide a guide or explanation of the particular situations of the market in a sector as specific as automation and quality control.
Abstract:
Organizations today must find different ways to survive in a time of rapid transformation. One of the mechanisms companies use to adapt to organizational change is management control systems, which in turn allow organizations to monitor their processes so that adaptation is effective. Another important variable for adaptation is organizational learning, the process through which organizations adapt to changes in the environment, both internal and external to the company. Given the above, this project is based on extracting valid supporting documentation that allows exploring the interactions between these two fields, management control systems and organizational learning, and on analysing the impact of these interactions on organizational durability.
Abstract:
In image processing, segmentation algorithms constitute one of the main focuses of research. In this paper, new image segmentation algorithms based on a hard version of the information bottleneck method are presented. The objective of this method is to extract a compact representation of a variable, considered the input, with minimal loss of mutual information with respect to another variable, considered the output. First, we introduce a split-and-merge algorithm based on the definition of an information channel between a set of regions (input) of the image and the intensity histogram bins (output). From this channel, the maximization of the mutual information gain is used to optimize the image partitioning. Then, the merging process of the regions obtained in the previous phase is carried out by minimizing the loss of mutual information. From the inversion of the above channel, we also present a new histogram clustering algorithm based on the minimization of the mutual information loss, where the input variable now represents the histogram bins and the output is given by the set of regions obtained from the above split-and-merge algorithm. Finally, we introduce two new clustering algorithms which show how the information bottleneck method can be applied to the registration channel obtained when two multimodal images are correctly aligned. Different experiments on 2-D and 3-D images show the behavior of the proposed algorithms.
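To illustrate the quantity the split-and-merge step optimises, the following sketch (toy one-dimensional "image", arbitrary binning) estimates the mutual information of the channel between region labels (input) and intensity-histogram bins (output) from joint counts; an informative split yields higher mutual information than merging everything into one region:

```python
import numpy as np

def mutual_information(region_labels, intensity_bins):
    # Estimate I(region; bin) from the joint counts of (region, bin) pairs.
    joint = np.zeros((region_labels.max() + 1, intensity_bins.max() + 1))
    for r, b in zip(region_labels, intensity_bins):
        joint[r, b] += 1
    p = joint / joint.sum()
    pr = p.sum(axis=1, keepdims=True)   # marginal over regions
    pb = p.sum(axis=0, keepdims=True)   # marginal over bins
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (pr @ pb)[nz])))

# Toy signal: two regions with distinct intensity distributions.
rng = np.random.default_rng(0)
intensity = np.concatenate([rng.normal(50, 5, 500), rng.normal(180, 5, 500)])
bins = np.clip((intensity // 32).astype(int), 0, 7)        # 8 histogram bins
split = np.r_[np.zeros(500, int), np.ones(500, int)]        # informative split
merged = np.zeros(1000, int)                                # everything merged
print(mutual_information(split, bins), mutual_information(merged, bins))
```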
Abstract:
A select-divide-and-conquer variational method to approximate configuration interaction (CI) is presented. Given an orthonormal set made up of occupied orbitals (Hartree-Fock or similar) and suitable correlation orbitals (natural or localized orbitals), a large N-electron target space S is split into subspaces S0, S1, S2, ..., SR. S0, of dimension d0, contains all configurations K with attributes (energy contributions, etc.) above thresholds T0 = {T0^egy, T0^etc.}; the CI coefficients in S0 remain always free to vary. S1 accommodates configurations K with attributes above T1 ≤ T0. An eigenproblem of dimension d0 + d1 for S0 + S1 is solved first, after which the last d1 rows and columns are contracted into a single row and column, thus freezing the last d1 CI coefficients hereinafter. The process is repeated with successive Sj (j ≥ 2) chosen so that the corresponding CI matrices fit in random access memory (RAM). Davidson's eigensolver is used R times. The final energy eigenvalue (lowest or excited one) is always above the corresponding exact eigenvalue in S. Threshold values {Tj; j = 0, 1, 2, ..., R} regulate accuracy; for a large-dimensional S, high accuracy requires S0 + S1 to be solved outside RAM. From there on, however, usually a few Davidson iterations in RAM are needed for each step, so that Hamiltonian matrix-element evaluation becomes rate determining. One microhartree accuracy is achieved for an eigenproblem of order 24 × 10^6, involving 1.2 × 10^12 nonzero matrix elements and 8.4 × 10^9 Slater determinants.
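The following is a heavily simplified sketch of the contraction step only, on a small dense toy matrix (the real method uses Davidson's eigensolver on huge sparse CI matrices): the eigenproblem on S0 + S1 is solved, the S1 coefficients are frozen into a single composite basis vector, and the remaining block is appended before solving the contracted problem, whose lowest eigenvalue stays above the exact one, as stated above.

```python
import numpy as np

def lowest_eigvec(H):
    w, v = np.linalg.eigh(H)
    return w[0], v[:, 0]

rng = np.random.default_rng(0)
n = 60
A = rng.standard_normal((n, n))
H = (A + A.T) / 2 + np.diag(np.arange(n, dtype=float))  # toy "Hamiltonian"

d0, d1 = 20, 20
idx01 = np.arange(d0 + d1)
e01, c01 = lowest_eigvec(H[np.ix_(idx01, idx01)])       # solve on S0 + S1

# Contract the last d1 coefficients into one frozen, normalized basis vector.
B = np.zeros((n, d0 + 1 + (n - d0 - d1)))
B[:d0, :d0] = np.eye(d0)                                # S0 stays free to vary
frozen = c01[d0:]
B[d0:d0 + d1, d0] = frozen / np.linalg.norm(frozen)     # frozen S1 combination
B[d0 + d1:, d0 + 1:] = np.eye(n - d0 - d1)              # remaining block "S2"

Hc = B.T @ H @ B                                        # contracted Hamiltonian
e_final, _ = lowest_eigvec(Hc)
e_exact, _ = lowest_eigvec(H)
print(e01, e_final, e_exact)  # e_exact <= e_final <= e01 (variational bound)
```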
Abstract:
A simple extended finite field nuclear relaxation procedure for calculating vibrational contributions to degenerate four-wave mixing (also known as the intensity-dependent refractive index) is presented. As a by-product one also obtains the static vibrationally averaged linear polarizability, as well as the first and second hyperpolarizability. The methodology is validated by illustrative calculations on the water molecule. Further possible extensions are suggested.
Abstract:
In the static field limit, the vibrational hyperpolarizability consists of two contributions due to: (1) the shift in the equilibrium geometry (known as nuclear relaxation), and (2) the change in the shape of the potential energy surface (known as curvature). Simple finite field methods have previously been developed for evaluating these static field contributions and also for determining the effect of nuclear relaxation on dynamic vibrational hyperpolarizabilities in the infinite frequency approximation. In this paper the finite field approach is extended to include, within the infinite frequency approximation, the effect of curvature on the major dynamic nonlinear optical processes.
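As a generic illustration of the finite-field idea underlying the two abstracts above (a sketch only: the toy energy expansion and the field step are assumptions, not the authors' procedure), static (hyper)polarizabilities can be recovered as numerical field derivatives of the energy:

```python
import numpy as np

# Toy field-dependent energy with known coefficients standing in for an
# electronic-structure calculation:
# E(F) = -mu*F - (1/2)*alpha*F^2 - (1/6)*beta*F^3 - (1/24)*gamma*F^4
mu, alpha, beta, gamma = 0.5, 8.0, 3.0, 120.0

def energy(F):
    return -mu * F - 0.5 * alpha * F**2 - beta * F**3 / 6.0 - gamma * F**4 / 24.0

F = 0.002  # field step (a.u.); an illustrative choice
E = {k: energy(k * F) for k in (-2, -1, 0, 1, 2)}

# Central-difference formulas for the field derivatives of E at F = 0.
alpha_ff = -(E[1] - 2 * E[0] + E[-1]) / F**2
beta_ff  = -(E[2] - 2 * E[1] + 2 * E[-1] - E[-2]) / (2 * F**3)
gamma_ff = -(E[2] - 4 * E[1] + 6 * E[0] - 4 * E[-1] + E[-2]) / F**4

print(alpha_ff, beta_ff, gamma_ff)  # ≈ 8.0, 3.0, 120.0
```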