529 results for bootstrap


Relevance:

10.00%

Publisher:

Abstract:

The management of non-governmental organizations (NGOs) poses particular difficulties because of the lack of funding, which hampers the hiring of qualified staff and the acquisition of adequate management tools. Even though they are non-profit, these institutions look for new ways to manage their organizational structure in pursuit of the mission and goals set out in their planning. ADECO (Associação para Defesa do Consumidor) is an NGO for civic intervention and social solidarity that defends the legitimate interests of all consumers, and in particular of its members. This work proposes the development of a web information system for ADECO that automates, simplifies, and speeds up information handling along two fronts: a public one and a private one. The private side manages the association's members and their complaints, while the public side is a site presenting information, news, and the audio and video of programmes broadcast by local media outlets. The system was developed in PHP with the CodeIgniter and Bootstrap frameworks; the database was created in MySQL and administered through phpMyAdmin.

Relevance:

10.00%

Publisher:

Abstract:

This study validates the observations made by the fishery observer programme Programa Bitácoras de Pesca (PBP) during 2005-2011 across the distribution area where the industrial purse-seine fleet fishes the north-central stock of Peruvian anchoveta (Engraulis ringens). For the same period and area, it also estimates the magnitude of discards due to excess catch, discards of juveniles, and the incidental catch of this fishery. A total of 3,768 trips were observed out of 302,859, a coverage of 1.2%. The data on discards due to excess catch, juvenile discards, and incidental catch recorded on the observed trips were characterised by a high proportion of zeros. To validate the observations, a Monte Carlo simulation study was carried out using a negative binomial distribution model, which allows inference about the optimal coverage level and about whether the information obtained by the observer programme is reliable. This analysis concludes that current observation levels should be increased to cover at least 10% of all annual trips made by the industrial purse-seine fleet fishing the north-central anchoveta stock. Discards due to excess catch, juvenile discards, and incidental catch were then estimated using three methodologies: bootstrap, generalized linear models (GLM), and the Delta model. The methodologies produced different magnitudes with similar trends. The estimates were compared with a Bayesian ANOVA, which showed little evidence that the estimated magnitudes of discards due to excess catch differed across methodologies, and likewise for incidental catch, whereas for juvenile discards there were substantial differences. The methodology that satisfied its assumptions and explained the most variability in the modelled variables was the Delta model, which appears to be the better estimation alternative given the high proportion of zeros in the data. The average estimates of discards due to excess catch, juvenile discards, and incidental catch under the Delta model were 252,580, 41,772, and 44,823 tonnes respectively, together representing 5.74% of landings. In addition, using the estimated magnitude of juvenile discards, a biomass projection exercise was run under the hypothetical scenario of no fishing mortality and discarded juveniles of only 8 and 11 cm in length, which indicated that the biomass that will not be available to the fishery lies between 52,000 and 93,000 tonnes.
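The coverage question above lends itself to a compact simulation. The sketch below mimics the validation idea under assumed parameters (the negative binomial mean and dispersion are illustrative, not the study's fitted values): simulate zero-heavy per-trip discards for the whole fleet, then see how the accuracy of an expansion estimate changes with observer coverage.

```python
# Monte Carlo check of observer coverage under a zero-heavy negative binomial.
# All distribution parameters are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(42)

n_trips = 302_859            # total annual trips (from the abstract)
mean, dispersion = 2.5, 0.1  # assumed NB mean/size; small size -> many zeros

# numpy parameterises the NB by (n=size, p), with p = size / (size + mean)
p = dispersion / (dispersion + mean)
discards = rng.negative_binomial(dispersion, p, size=n_trips).astype(float)
true_total = discards.sum()

for coverage in (0.012, 0.05, 0.10):     # 1.2% = current level, 10% = recommended
    n_obs = int(coverage * n_trips)
    errors = []
    for _ in range(500):                 # Monte Carlo replications
        sample = rng.choice(discards, size=n_obs, replace=False)
        estimate = sample.mean() * n_trips   # expand the sample mean to the fleet
        errors.append(abs(estimate - true_total) / true_total)
    print(f"coverage {coverage:5.1%}: median relative error {np.median(errors):.3f}")
```

Under these assumed parameters, the relative error of the expanded estimate shrinks markedly as coverage rises toward 10%, which is the qualitative pattern behind the recommendation above.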

Relevance:

10.00%

Publisher:

Abstract:

The population status of the violet crab Platyxanthus orbignyi (Milne Edwards and Lucas, 1843) off the Lambayeque coast of Peru is analysed for the period 2001-2010 using: (1) the Schaefer dynamic biomass model in its observation-error version, and (2) the same dynamic model extended with an environmental variable, the sea surface temperature anomaly (SSTA) for the San José area (Lambayeque); both models are based on catch, effort, and CPUE data. Maximum likelihood was used for fitting, and the bootstrap to determine confidence intervals for the parameters. The population and fishery parameters estimated by the Schaefer dynamic biomass model (MDB) were K = 750,000 kg, r = 0.21, and q = 8.36 x 10^-6; for the dynamic model with the environmental variable (MDVA) they were K = 765,000 kg, r = 0.23, and q = 8.02 x 10^-6. From these estimates the main biological reference points (BRPs) were calculated: MSY = 39,822 kg, B_MSY = 375,000 kg, f_MSY = 12,561 traps, F_MSY = 0.11, and F_0.1 = 0.10 for the MDB; and MSY = 44,069 kg, B_MSY = 382,500 kg, f_MSY = 13,782 traps, F_MSY = 0.12, and F_0.1 = 0.10 for the MDVA. The results indicate that the violet crab fishery off Lambayeque is currently very close to its optimal level. Since no direct survey information is available for this resource to confirm the MDB and MDVA results, and given the quality of the data, adaptive management of the fishery around the reference point F_0.1 is suggested, taking environmental conditions into account.
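The Schaefer reference points quoted above follow from closed-form expressions, so they are easy to verify. A minimal check under the surplus-production formulas MSY = rK/4, B_MSY = K/2, and F_MSY = r/2 (with effort f_MSY = F_MSY / q), using the MDB estimates from the abstract:

```python
# Biological reference points implied by the Schaefer model B' = rB(1 - B/K) - C,
# computed from the MDB parameter estimates quoted in the abstract.
K, r, q = 750_000, 0.21, 8.36e-6   # carrying capacity (kg), growth rate, catchability

msy = r * K / 4                    # maximum sustainable yield (kg)
b_msy = K / 2                      # biomass at MSY (kg)
f_rate_msy = r / 2                 # fishing mortality rate at MSY
effort_msy = f_rate_msy / q        # effort (traps) at MSY

print(f"MSY   = {msy:,.0f} kg")          # ~39,375 kg vs 39,822 reported
print(f"B_MSY = {b_msy:,.0f} kg")        # 375,000 kg, matching the abstract
print(f"F_MSY = {f_rate_msy:.3f}")       # 0.105, ~0.11 reported
print(f"f_MSY = {effort_msy:,.0f} traps")  # ~12,560 vs 12,561 reported
```

The closed-form values land on or very near the reported reference points, which is expected since the latter derive from the same fitted parameters.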

Relevance:

10.00%

Publisher:

Abstract:

In recent decades the public sector has come under pressure to improve its performance, and Information Technology (IT) has increasingly been used as a tool for reaching that goal. It has therefore become important for public organizations, particularly institutions of higher education, to determine which factors influence the acceptance and use of technology, as these affect the success of implementation and the desired organizational results. The Technology Acceptance Model (TAM), built on the constructs of perceived usefulness and perceived ease of use, was used as the basis for this study. However, because of the complexity of implementing integrated management systems, organizational factors were added in order to better explain the acceptance of such systems. Five constructs related to critical success factors in ERP implementation were therefore added to the TAM: top management support, communication, training, cooperation, and technological complexity (BUENO and SALMERON, 2008). This leads to the following research problem: which factors influence the acceptance and use of the SIE / academic module at the Federal University of Pará, from the perspective of its teacher and technician users? The purpose of this study was to identify the influence of organizational factors and behavioral antecedents on the behavioral intention to use the SIE / academic module at UFPA from the perspective of teacher and technician users. This is applied, exploratory, and descriptive quantitative research conducted as a survey; data were collected through a structured questionnaire applied to a sample of 229 teachers and 30 technical-administrative staff. Data were analysed with descriptive statistics and structural equation modeling using partial least squares (PLS). The measurement model was assessed first, and reliability as well as convergent and discriminant validity were verified for all indicators and constructs. The structural model was then analysed using bootstrap resampling. In the assessment of statistical significance, all hypotheses were supported. The coefficient of determination (R²) was high or moderate for five of the six endogenous variables, and the model explains 47.3% of the variation in behavioral intention. Among the antecedents of behavioral intention (BI) analysed in this study, perceived usefulness is the variable with the greatest effect on behavioral intention, followed by perceived ease of use (PEU) and attitude (AT). Among the organizational aspects (critical success factors) studied, technological complexity (TC) and training (ERT) had the greatest effects on behavioral intention to use, although these effects were smaller than those produced by the behavioral factors originating from the TAM. Top management support (TMS) showed, among all variables, the smallest effect on intention to use (BI), followed by communication (COM) and cooperation (CO), which also exert a low effect on behavioral intention (BI). As in other studies, the TAM constructs proved adequate for the present research. The study thus contributes evidence that the Technology Acceptance Model can be applied to predict the acceptance of integrated management systems, even in public organizations. Keywords: Technology
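As a rough illustration of the bootstrap step reported above (judging the significance of path coefficients), the following sketch bootstraps a single regression path on synthetic data; a plain least-squares slope stands in for the full PLS-SEM estimation, and all data are made up.

```python
# Bootstrap significance of one "path coefficient" on synthetic data.
# A single OLS slope stands in for the full PLS path model.
import numpy as np

rng = np.random.default_rng(0)
n = 259                                        # sample size from the abstract (229 + 30)
pu = rng.normal(size=n)                        # stand-in for perceived usefulness
bi = 0.5 * pu + rng.normal(scale=0.8, size=n)  # stand-in for behavioral intention

def path_coef(x, y):
    # slope of y on x: the "path" in this toy one-path model
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

boot = []
for _ in range(5_000):
    idx = rng.integers(0, n, size=n)           # resample cases with replacement
    boot.append(path_coef(pu[idx], bi[idx]))

boot = np.asarray(boot)
est = path_coef(pu, bi)
se = boot.std(ddof=1)
print(f"path = {est:.3f}, bootstrap SE = {se:.3f}, t = {est / se:.2f}")
print("95% percentile CI:", np.percentile(boot, [2.5, 97.5]))
```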

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

10.00%

Publisher:

Abstract:

In the absence of a single, precise measure of efficiency for hockey players, this study aims to evaluate the efficiency of players in the National Hockey League (NHL) and to show how it can affect the decision to buy out a player's contract. Individual NHL player statistics for the 2007-2008 through 2010-2011 seasons are used. Efficiency is estimated with data envelopment analysis (DEA) with bootstrap. Inputs include salary and minutes played, while outputs include each player's defensive and offensive contributions. A logistic regression is used to estimate the association between individual efficiency and the probability of a contract buyout. The analysis shows that, across 3,159 observations, mean efficiency is 0.635. Mean efficiency is similar across positions and seasons. A strong positive association is found between a team's points in the overall standings and the mean efficiency of its players (correlation coefficient = 0.43, p < 0.01). Players with higher efficiency have a lower probability of having their contract bought out (odds ratio = 0.01, p < 0.01). The study therefore concludes that most NHL hockey players carry a non-negligible degree of inefficiency, that higher efficiency is associated with better team performance, and that efficient players are less likely to have their contracts bought out.
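The efficiency scores described above come from an input-oriented DEA model, which can be set up as one linear program per player. The sketch below does this with scipy for synthetic players (two inputs: salary and minutes; two outputs: offensive and defensive contribution). It computes only the point scores; the study's bootstrap correction of DEA scores is a more involved procedure and is not reproduced here.

```python
# Input-oriented CCR DEA: one LP per decision-making unit (player).
# Data are synthetic; only the point efficiency scores are computed.
import numpy as np
from scipy.optimize import linprog

def dea_score(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0. X: inputs (m, n), Y: outputs (s, n)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]            # minimise theta over z = [theta, lambda]
    A_in = np.c_[-X[:, [j0]], X]           # X @ lam - theta * x0 <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]    # -Y @ lam <= -y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]                        # theta = efficiency score in (0, 1]

rng = np.random.default_rng(1)
n = 50                                     # toy "players"
X = rng.uniform(1, 10, size=(2, n))        # inputs: salary, minutes
Y = X.sum(axis=0) * rng.uniform(0.5, 1.0, size=(2, n))  # outputs: offense, defense

scores = np.array([dea_score(X, Y, j) for j in range(n)])
print(f"mean efficiency: {scores.mean():.3f}")   # cf. 0.635 reported in the abstract
```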

Relevance:

10.00%

Publisher:

Abstract:

The geometric geoid model MGH44 is the result of a direct comparison between GPS measurements and conventional levelling at points of a geodetic network covering the 50 km² urban area of the city of Heredia, Costa Rica. The MGH44 grid gives the geoid undulation at any point in that area, a value that can be used to estimate height above mean sea level from GPS ellipsoidal height measurements. This paper describes the procedures and calculations used to assess the vertical quality of the MGH44 model by applying the National Standard for Spatial Data Accuracy (NSSDA). A new grid built from only 36 data points, called MGH36, provided new geoid undulation values for the remaining 20 points chosen as control. During processing, different algorithms were applied to check whether the data at the 20 control points follow a normal distribution and to verify that the set contains no gross errors. The mean geoid undulation at the control points is 14.287 m, and the NSSDA computation gave a vertical accuracy of ±0.045 m. Subsequently, using the bootstrap technique, the values 14.233 m and 14.353 m were calculated, with 95% probability, as the limits of the confidence interval for the mean.
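Both computations reported above, the NSSDA statistic and the bootstrap interval for the mean, are short enough to sketch directly. The control-point values below are synthetic stand-ins, not the study's data:

```python
# NSSDA vertical accuracy (1.96 * RMSE of the control-point errors) and a
# percentile-bootstrap CI for the mean geoid undulation, on synthetic data.
import numpy as np

rng = np.random.default_rng(7)
undulation = rng.normal(14.287, 0.060, size=20)  # stand-in control-point undulations (m)
errors = rng.normal(0.0, 0.023, size=20)         # stand-in MGH36-vs-control differences (m)

rmse = np.sqrt(np.mean(errors ** 2))
print(f"NSSDA vertical accuracy: +/- {1.96 * rmse:.3f} m")   # cf. +/- 0.045 m reported

boot_means = [rng.choice(undulation, size=20, replace=True).mean()
              for _ in range(10_000)]
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {undulation.mean():.3f} m, 95% CI = [{lo:.3f}, {hi:.3f}] m")
```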

Relevance:

10.00%

Publisher:

Abstract:

This paper presents the development and evaluation of PICTOAPRENDE, interactive software designed to improve oral communication and to contribute to the development of children and young people diagnosed with autism spectrum disorder (ASD) in Ecuador. To this end, it first analyses the intervention area, describing the general characteristics of people with ASD and their situation in Ecuador. The statistical techniques used for this evaluation form the basis of the study. A section presenting the development of the research, based on cognitive and social parameters of the intervention area, is also included. Finally, the algorithms used to obtain the measurements and the experimental results are presented, along with their analysis.

Relevance:

10.00%

Publisher:

Abstract:

Given the observed increase in regeneration failures in the boreal forest and their impact on the productivity and resilience of dense black spruce stands, a better understanding of resilience mechanisms and monitoring of regeneration-failure risk are needed. The main objective of this study was therefore to develop predictive, spatially explicit models of black spruce regeneration. Two models were developed: (1) a theoretical model built from in situ data and spatial data, and (2) a cartographic model using only readily available spatial data, such as provincial forest inventories and the differenced Normalized Burn Ratio (dNBR), a spectral index of fire severity. The results show that a short interval (< 55 years) between harvesting and fire does not automatically open up black spruce stands. Stands affected by post-burn salvage logging (1963) that were immature at the time of the 2005 fire are characterised by poor regeneration. By contrast, regeneration after the 2005 fire in stands cut between 1948 and 1967 is similar to that observed in stands left undisturbed during the 60 years preceding the fire. The theoretical model, selected using Akaike's information criterion, identified three variables that determine the success or failure of black spruce regeneration: (1) potential vegetation, (2) the percentage of ground cover by sphagnum, and (3) fire severity as measured by the dNBR. Bootstrap and cross-validation showed that a model using these three variables explains 59% of the variability in regeneration observed in the study area, while the cartographic model, which uses only the potential vegetation and dNBR variables, explains 32%. Finally, this model was used to produce a regeneration-failure risk map. Given the model's accuracy, this map offers interesting potential for targeting the sectors most at risk and thus supporting reforestation decisions in burned areas.
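The workflow described above, AIC-based selection followed by bootstrap validation of explained variance, can be illustrated compactly. In the sketch below the three predictors are synthetic stand-ins for potential vegetation, sphagnum cover, and dNBR:

```python
# AIC model selection over predictor subsets, then a bootstrap check of R^2.
# All data are synthetic stand-ins for the study's variables.
import itertools
import numpy as np

rng = np.random.default_rng(3)
n = 300
names = ["veg_pot", "sphagnum", "dnbr"]
X = rng.normal(size=(n, 3))
y = X @ np.array([0.6, -0.4, -0.5]) + rng.normal(scale=0.8, size=n)

def gaussian_aic(cols):
    A = np.c_[np.ones(n), X[:, cols]]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    k = A.shape[1] + 1                      # coefficients + error variance
    return n * np.log(rss / n) + 2 * k      # Gaussian AIC up to a constant

subsets = [list(c) for r in (1, 2, 3)
           for c in itertools.combinations(range(3), r)]
best = min(subsets, key=gaussian_aic)
print("AIC-selected predictors:", [names[i] for i in best])

r2s = []                                    # bootstrap-validate R^2 of the chosen model
for _ in range(1_000):
    idx = rng.integers(0, n, size=n)
    A = np.c_[np.ones(n), X[idx][:, best]]
    beta, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    resid = y[idx] - A @ beta
    r2s.append(1 - resid.var() / y[idx].var())
print(f"bootstrap R^2: median {np.median(r2s):.2f}")
```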

Relevance:

10.00%

Publisher:

Abstract:

Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. In Chapter 2 we first consider a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals, for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis onwards, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of 10, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time variation in parameters and other sources of uncertainty in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond one month. At shorter horizons, however, our methods fail to forecast better than the RW, and we identify uncertainty in coefficient estimation, together with uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability. Chapter 4 focuses on the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the one-month horizon, and outperforms alternative methods, including Bayesian methods, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supply and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices are strongly favoured. The chapter also introduces the random walk Metropolis-Hastings technique as a new tool for estimating MIDAS regressions.
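Of the methods above, bumping is perhaps the least familiar and is easy to sketch: fit the model on many bootstrap replications, score every fitted candidate on the original sample, and keep the best. The following toy uses a small linear regression in place of the thesis's sets of exchange rate fundamentals; all data are synthetic.

```python
# "Bumping": bootstrap-based model search scored on the original sample.
import numpy as np

rng = np.random.default_rng(5)
n, p = 120, 4
X = np.c_[np.ones(n), rng.normal(size=(n, p))]   # constant + 4 toy "fundamentals"
y = X @ np.array([0.1, 0.5, 0.0, -0.3, 0.0]) + rng.normal(scale=1.0, size=n)

def ols(Xs, ys):
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return beta

def original_mse(beta):
    return np.mean((y - X @ beta) ** 2)          # always scored on the ORIGINAL data

candidates = [ols(X, y)]                         # include the original-sample fit
for _ in range(200):
    idx = rng.integers(0, n, size=n)             # bootstrap replication
    candidates.append(ols(X[idx], y[idx]))

best = min(candidates, key=original_mse)
print("bumped coefficients:", np.round(best, 3))
print(f"in-sample MSE: {original_mse(best):.4f}")
```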

Relevance:

10.00%

Publisher:

Abstract:

The Internet has grown in size at rapid rates since BGP records began, and continues to do so. This has raised concerns about the scalability of the current BGP routing system, as the routing state at each router in a shortest-path routing protocol grows supra-linearly as the network grows. The concerns are that the memory capacity of routers will not be able to keep up with demand, and that the growth of the Internet will become ever more cramped as more and more of the world seeks the benefits of being connected. Compact routing schemes, where the routing state grows only sub-linearly relative to the growth of the network, could solve this problem and ensure that router memory is not a bottleneck to Internet growth. These schemes trade shortest-path routing for scalable memory state by allowing some paths a certain amount of bounded "stretch". The most promising such scheme is Cowen Routing, which can provide scalable, compact routing state for Internet routing while still providing shortest-path routing to nearly all other nodes, with only slightly stretched paths to a very small subset of the network. Currently, no fully distributed form of Cowen Routing exists that would be practical for the Internet. This dissertation describes a fully distributed and compact protocol for Cowen Routing, using the k-core graph decomposition. Previous compact routing work showed that the k-core graph decomposition is useful for Cowen Routing on the Internet, but no distributed form existed. This dissertation gives a distributed k-core algorithm optimised to be efficient on dynamic graphs, along with proofs of its correctness. The performance and efficiency of this distributed k-core algorithm are evaluated on large Internet AS graphs, with excellent results. The dissertation then goes on to describe a fully distributed and compact Cowen Routing protocol, comprising: a landmark selection process for Cowen Routing that uses the k-core algorithm, with mechanisms to ensure compact state at all times, including at bootstrap; a local cluster routing process, with mechanisms for policy application and control of cluster sizes, again ensuring that state remains compact at all times; and a landmark routing process with a prioritisation mechanism for announcements that ensures compact state at all times.
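For readers unfamiliar with the k-core decomposition underlying the landmark selection, the centralised peeling sketch below shows what a node's core number is. The dissertation's contribution is a distributed, dynamic-graph version of this computation, which the sketch does not attempt.

```python
# Centralised k-core decomposition by repeated peeling of the minimum-degree
# vertex. O(n^2) as written; fine for illustration.
def core_numbers(adj):
    """adj: dict node -> set of neighbours. Returns dict node -> core number."""
    adj = {v: set(ns) for v, ns in adj.items()}   # mutable local copy
    core, k = {}, 0
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))   # lowest remaining degree
        k = max(k, len(adj[v]))                   # core numbers are monotone in peel order
        core[v] = k
        for u in adj[v]:
            adj[u].discard(v)                     # remove v from the graph
        del adj[v]
    return core

# toy example: a triangle (2-core) with one pendant vertex (1-core)
g = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(core_numbers(g))    # {4: 1, 1: 2, 2: 2, 3: 2}
```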

Relevance:

10.00%

Publisher:

Abstract:

This final-degree project developed an administration application that replaces the default admin offered by applications built with the Django web framework. The application has two parts: a server, developed with Node and Express, which queries the Django application's MySQL database (acting as the link between the two) and exposes an API consumed by the other part of the application, the client. The API is entirely private: a valid authentication token is required to obtain a successful response from it, and generating the token is also the server's job. The client, the part the end user sees, is developed with the Angular framework. The user interface uses Bootstrap, so it renders correctly on any kind of device, desktop or mobile. In short, an end-to-end JavaScript application was developed using the latest web technologies, substantially improving on the features offered by the administration panel generated automatically by a Django application.
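The server side above is Node/Express; purely to illustrate the private-API token scheme in the same language as the other sketches in this listing, here is a minimal HMAC-signed token issuer/verifier in Python. The secret and the payload layout are assumptions, not the project's actual design.

```python
# Minimal HMAC-signed bearer token: the server issues it and verifies it
# before answering any API call. Illustrative only.
import hashlib
import hmac
import time

SECRET = b"change-me"    # server-side secret, never sent to the client

def issue_token(user: str, ttl: int = 3600) -> str:
    expires = str(int(time.time()) + ttl)
    payload = f"{user}:{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def verify_token(token: str) -> bool:
    try:
        user, expires, sig = token.rsplit(":", 2)
    except ValueError:
        return False
    payload = f"{user}:{expires}"
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and time.time() < int(expires)

tok = issue_token("admin")
print(verify_token(tok))    # True while the token is still valid
```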

Relevance:

10.00%

Publisher:

Abstract:

The project is a web vulnerability search portal called Krashr, whose goal is to check whether a web page submitted by a user contains any kind of exploitable vulnerability, and to help the user fix the vulnerabilities found. It has a back-end written in Python with a PostgreSQL database, a web front-end built with AngularJS, and an API based on Node.js and Express that connects the two.
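As a hedged illustration of the kind of passive check such a portal might run (this is not Krashr's actual code), the sketch below fetches a URL and reports missing HTTP security headers. Only run it against sites you own or have permission to test.

```python
# Passive check: report recommended HTTP security headers missing from a URL.
import requests

EXPECTED_HEADERS = [
    "Content-Security-Policy",
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "X-Frame-Options",
]

def check_security_headers(url: str) -> list[str]:
    """Return the recommended security headers missing from the response."""
    resp = requests.get(url, timeout=10, allow_redirects=True)
    return [h for h in EXPECTED_HEADERS if h not in resp.headers]

if __name__ == "__main__":
    for header in check_security_headers("https://example.com"):
        print(f"missing: {header}")
```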

Relevance:

10.00%

Publisher:

Abstract:

Drag & Drop is a web application designed for building exercises out of pieces, offering teachers a new way to assess their students. The web application serves as an environment for creating questions and answers. To answer a question, students are given elements called "pieces", which they use to construct their answer. When creating a question, the teacher defines the ideal solution to the problem and the set of pieces the students may use to build their own. When a student finishes a solution, it is sent to the server, which evaluates it by comparing the student's solution with the ideal solution proposed by the teacher. Finally, the teacher reviews the exercise and adjusts the grade, either accepting the one proposed by the system or setting their own.
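The abstract does not specify how the server compares a student's answer with the ideal solution; one simple, hypothetical scoring rule is sequence similarity between the two piece lists, as sketched below. The function name and the 10-point scale are assumptions for illustration.

```python
# Hypothetical grading rule: score a submission by its sequence similarity
# to the teacher's ideal list of pieces.
from difflib import SequenceMatcher

def propose_grade(ideal: list[str], submitted: list[str],
                  max_points: float = 10.0) -> float:
    """Propose a score from the similarity of the two piece sequences."""
    similarity = SequenceMatcher(None, ideal, submitted).ratio()  # in [0, 1]
    return round(similarity * max_points, 1)

ideal = ["for", "i", "in", "range(10)", ":", "print(i)"]
submitted = ["for", "i", "in", "range(10)", "print(i)"]
print(propose_grade(ideal, submitted))   # proposed score, adjustable by the teacher
```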
