615 results for Unbalanced Bidding
Abstract:
Three multivariate statistical tools (principal component analysis, factor analysis, and discriminant analysis) have been tested to characterize and model the voltage sags registered in distribution substations. The models use several features, obtained from voltage and current waveforms, to represent the magnitude, duration and degree of unbalance of the sags. The techniques are tested and compared using 69 sag records. The advantages and drawbacks of each technique are listed.
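As a rough illustration of the first of these tools, the sketch below applies PCA to a hypothetical sag feature matrix (magnitude, duration, degree of unbalance) with scikit-learn; the feature ranges and names are invented, not taken from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: one row per sag record, columns for
# residual voltage magnitude (p.u.), duration (ms) and degree of unbalance (%).
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(0.1, 0.9, 69),   # magnitude
    rng.uniform(20, 500, 69),    # duration
    rng.uniform(0, 40, 69),      # unbalance
])

# Standardize the features, then project onto the leading principal components.
Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=2)
scores = pca.fit_transform(Xs)
print("explained variance ratios:", pca.explained_variance_ratio_)
```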
Abstract:
This work is, first of all, a substantial exercise in compiling and systematizing the existing regulation on competition and, in particular, on collusion in tenders as one of the practices restrictive of competition. It also contains a microeconomic analysis of a cartel, in order to explain the factors that encourage and discourage it. It then offers an in-depth analysis of the specific rules on the matter, contained in Decree 2153 of 1992 and Law 1474 of 2011, and of the actions that could be brought to combat collusion, depending on the stage the tender has reached and the legal interest to be protected.
Abstract:
The aim of this research is to analyze the changes in the migration policies of Italy and Libya following the Treaty of Friendship and Cooperation signed in 2008. Using Barry Buzan's concept of securitization, it explains the main motivations that led both states to harden their migration policies in order to confront irregular migration. The securitization of migration became the Italian government's main mechanism for justifying its non-compliance with international agreements, relegating the protection of human rights to a secondary role. This situation carries high humanitarian costs and exposes how Italy and Libya are handling new threats such as irregular migration in this region.
Abstract:
This paper studies the mathematics and language results of 32,000 students from the city of Bogotá on the 2008 Saber 11 test. The analysis recognizes that individuals are nested within neighborhoods and schools, but not all individuals from the same neighborhood attend the same school, and vice versa. To model this data structure we use several econometric models, including a cross-classified hierarchical multilevel regression. Our central objective is to identify to what extent neighborhood and school conditions correlate with the educational outcomes of the target population, and which neighborhood and school characteristics are most strongly associated with test results. We use data from the Saber 11 test, the C600 school census, the 2005 population census, and the Bogotá metropolitan police. Our estimates show that both the neighborhood and the school are correlated with test results, but the school effect appears to be much stronger than the neighborhood effect. The school characteristics most strongly associated with test results are teacher education, the length of the school day, tuition fees, and the school's socioeconomic context. The neighborhood characteristics most strongly associated with test results are the presence of university students in the UPZ, a cluster of high education levels, and the neighborhood crime level, which is negatively correlated. These results hold after controlling for family and individual characteristics.
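A cross-classified model of this kind can be sketched in statsmodels by expressing the crossed random intercepts as variance components over a single dummy group; the column names (score, ses, school_id, neighborhood_id) and the data file are hypothetical, not the authors' own.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per student, with school and neighborhood IDs.
df = pd.read_csv("saber11_bogota.csv")  # assumed columns: score, ses, school_id, neighborhood_id
df["one"] = 1  # single dummy group so school and neighborhood effects cross freely

# Crossed random intercepts for school and neighborhood,
# with a fixed effect for socioeconomic status.
model = smf.mixedlm(
    "score ~ ses",
    data=df,
    groups="one",
    vc_formula={"school": "0 + C(school_id)",
                "neighborhood": "0 + C(neighborhood_id)"},
)
result = model.fit()
print(result.summary())
```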
Abstract:
The purpose of this monograph is to analyze, from the standpoint of club goods theory, the effect that the self-financing model promoted by the Colombian state has had on the public character of the Universidad Nacional de Colombia over the last twenty years. Through a theoretical and factual analysis, it reviews the regulation of higher education in Colombia to explain how the unbalanced growth of the university's own revenues relative to the nation's contributions to the budget of the Universidad Nacional de Colombia, together with certain problems in operating expenses, has perpetuated levels of exclusion in access to the good, thereby restricting the public character of the service this institution has provided over the last two decades.
Abstract:
BACKGROUND: The isolation of free fetal cells or fetal DNA in maternal blood opens a window of non-invasive diagnostic possibilities for monogenic and chromosomal pathologies, in addition to allowing identification of fetal sex and Rh. Multiple studies currently evaluate the efficacy of these methods, showing them to be cost-effective and lower-risk than the gold standard. This work describes the evidence on non-invasive prenatal diagnosis found through a systematic review of the literature. OBJECTIVES: The objective of this study was to gather the evidence meeting the search criteria on non-invasive fetal diagnosis using free fetal cells in maternal blood, in order to determine its diagnostic utility. METHODS: A systematic review of the literature was conducted to determine whether non-invasive prenatal diagnosis using free fetal cells in maternal blood is effective as a diagnostic method. RESULTS: 5,893 articles met the search criteria; 67 met the inclusion criteria: 49.3% (33/67) were cross-sectional studies, 38.8% (26/67) cohort studies, and 11.9% (8/67) case-control studies. Results for sensitivity, specificity and type of test were obtained. CONCLUSION: This systematic review shows that non-invasive prenatal diagnosis is a feasible, reproducible and sensitive technique for fetal diagnosis that avoids the risks of invasive diagnosis.
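For readers unfamiliar with the accuracy measures pooled in reviews like this one, a minimal sketch with invented counts shows how sensitivity and specificity are computed from a 2x2 table of test results against the gold standard:

```python
# Invented 2x2 counts for a hypothetical non-invasive test vs. the gold standard.
tp, fn = 95, 5     # affected fetuses: test positive / test negative
tn, fp = 880, 20   # unaffected fetuses: test negative / test positive

sensitivity = tp / (tp + fn)   # proportion of affected cases detected
specificity = tn / (tn + fp)   # proportion of unaffected cases correctly cleared
print(f"sensitivity = {sensitivity:.2%}, specificity = {specificity:.2%}")
```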
Abstract:
Potential mining of the Pliocene and Quaternary geological formations for aggregate production along the middle course of the Fluvil river is likely to be carried out through gravel pits. The most significant environmental impacts are expected to occur during site preparation and extraction of the aggregate. Several types of environmental impact have been considered: variation of the water table, loss of soil and vegetation, development of unbalanced landforms, and deterioration of the landscape.
Abstract:
We use a simplified atmospheric general circulation model (AGCM) to investigate the response of the lower atmosphere to thermal perturbations in the lower stratosphere. The results show that generic heating of the lower stratosphere tends to weaken the sub-tropical jets and the tropospheric mean meridional circulations. The positions of the jets, and the extent of the Hadley cells, respond to the distribution of the stratospheric heating, with low latitude heating displacing them poleward, and uniform heating displacing them equatorward. The patterns of response to the low latitude heating are similar to those found to be associated with solar variability in previous observational data analysis, and to the effects of varying solar UV radiation in sophisticated AGCMs. In order to investigate the chain of causality involved in converting the stratospheric thermal forcing to a tropospheric climate signal we conduct an experiment which uses an ensemble of model spin-ups to analyse the time development of the response to an applied stratospheric perturbation. We find that the initial effect of the change in static stability at the tropopause is to reduce the eddy momentum flux convergence in this region. This is followed by a vertical transfer of the momentum forcing anomaly by an anomalous mean circulation to the surface, where it is partly balanced by surface stress anomalies. The unbalanced part drives the evolution of the vertically integrated zonal flow. We conclude that solar heating of the stratosphere may produce changes in the circulation of the troposphere even without any direct forcing below the tropopause. We suggest that the impact of the stratospheric changes on wave propagation is key to the mechanisms involved.
Abstract:
In this paper it is argued that rotational wind is not the best choice of leading control variable for variational data assimilation, and an alternative is suggested and tested. A rotational wind parameter is used in most global variational assimilation systems as a pragmatic way of approximately representing the balanced component of the assimilation increments. In effect, rotational wind is treated as a proxy for potential vorticity, but one that is potentially not a good choice in flow regimes characterised by a small Burger number. This paper reports on an alternative set of control variables based around potential vorticity. This gives rise to a new formulation of the background error covariances for the Met Office's variational assimilation system, which leads to flow dependency. It uses balance relationships similar to those of traditional schemes, but recognises the existence of an unbalanced rotational wind, which is used with a new anti-balance relationship. The new scheme is described, and its performance is evaluated and compared to a traditional scheme using a sample of diagnostics.
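Schematically, and as a hedged sketch only (shallow-water notation assumed here, not necessarily the paper's exact formulation): a traditional scheme treats the whole rotational wind increment $\delta\psi$ as balanced and derives the mass field from it through a balance operator, whereas a PV-based scheme takes a linearized-PV variable as the leading control variable and defines the unbalanced rotational wind $\delta\psi_{u}$ through an anti-balance relation that makes it carry no PV:

$$\delta q \;\propto\; \nabla^{2}\delta\psi \;-\; \frac{f_{0}}{\bar{\Phi}}\,\delta\Phi \qquad\text{(leading, PV-like variable)},$$

$$\nabla^{2}\delta\psi_{u} \;=\; \frac{f_{0}}{\bar{\Phi}}\,\delta\Phi_{u} \qquad\text{(anti-balance: } \delta q_{u}=0\text{)}.$$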
Abstract:
Planning a project with proper consideration of all necessary factors, and managing a project to ensure its successful implementation, face many challenges. The initial stage of planning a project for bidding is costly and time consuming, and usually yields cost and effort predictions of poor accuracy. On the other hand, detailed information about previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from previous experience. Project portfolios have been brought into this field with the aim of improving information sharing and management among different projects. However, the amount of information that can be shared is still limited to generic information. In this paper we report on COBRA (Automated Project Information Sharing and Management System), a recently developed software system that automatically generates a project plan with time and cost effort estimates based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method using product-based planning from the PRINCE2 methodology. Keywords: project management, product based planning, best practice, PRINCE2
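The abstract does not specify COBRA's estimation algorithm, so the following is only a generic sketch of analogy-based effort estimation from completed projects (nearest neighbours over project attributes); all names and figures are hypothetical.

```python
import numpy as np

# Hypothetical records of completed projects: (size, team, duration_weeks, cost_k)
past = np.array([
    [120.0, 5, 26, 310],
    [300.0, 9, 52, 820],
    [ 80.0, 3, 18, 190],
    [200.0, 7, 40, 560],
])
features, outcomes = past[:, :2], past[:, 2:]

def estimate(new_project, k=2):
    """Predict duration and cost as the mean of the k most similar past projects."""
    # Scale features so size and team count contribute comparably to the distance.
    scale = features.std(axis=0)
    d = np.linalg.norm((features - new_project) / scale, axis=1)
    nearest = np.argsort(d)[:k]
    return outcomes[nearest].mean(axis=0)

duration, cost = estimate(np.array([150.0, 6]))
print(f"estimated duration: {duration:.0f} weeks, cost: {cost:.0f}k")
```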
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper.
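The accumulation step described above is simple to state in code; here is a minimal sketch with invented variance components for a four-stage nested design (the paper's Fortran program and REML code are not reproduced here).

```python
import numpy as np

# Invented variance components from a hierarchical ANOVA of a nested survey,
# ordered from the coarsest stage (largest separating distance) to the finest.
lags = np.array([1000.0, 300.0, 100.0, 30.0])    # separating distances (m)
components = np.array([0.12, 0.08, 0.05, 0.03])  # variance component per stage

# The semivariance at a given lag is the sum of the components for that stage
# and all finer stages, so accumulate from the shortest lag upward.
gamma = np.cumsum(components[::-1])[::-1]

for d, g in zip(lags, gamma):
    print(f"lag {d:6.0f} m -> semivariance {g:.2f}")
```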
Abstract:
An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures for isoproturon degradation rate and sorption are identified without too great a loss of information in this field.
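As a hedged sketch (not the authors' code), variance components for an unbalanced nested design can be estimated by REML with statsmodels; the stage labels and data file below are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical nested sampling data: stage1 is the coarsest grouping,
# stage2 is nested within stage1, stage3 within stage2.
df = pd.read_csv("nested_design.csv")  # assumed columns: value, stage1, stage2, stage3

# REML fit of a purely nested random-effects model: the group variance gives
# the stage-1 component; each vc_formula term adds one finer-stage component.
model = smf.mixedlm(
    "value ~ 1",
    data=df,
    groups="stage1",
    vc_formula={"stage2": "0 + C(stage2)", "stage3": "0 + C(stage3)"},
)
result = model.fit(reml=True)
print(result.summary())
```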
Abstract:
We examine the motion of the ground surface on the Soufriere Hills Volcano, Montserrat between 1998 and 2000 using radar interferometry (InSAR). To minimise the effects of variable atmospheric water vapour on the InSAR measurements, we use independently derived measurements of the radar path delay from six continuous GPS receivers. The surfaces providing a measurable interferometric signal are those on pyroclastic flow deposits, mainly emplaced in 1997. Three types of surface motion can be discriminated. Firstly, the surfaces of thick, valley-filling deposits subsided at rates of 120-150 mm/year in the year after emplacement, slowing to 30-50 mm/year two years later. This must be due to contraction and settling effects during cooling. The second type is the near-field motion localised within about one kilometre of the dome. Both subsidence and uplift events are seen, and though the former could be due to surface gravitational effects, the latter may reflect shallow (< 1 km) pressurisation effects within the conduit/dome. Far-field motions of the surface away from the deeply buried valleys are interpreted as crustal strains. Because the flux of magma to the surface stopped from March 1998 to November 1999 and then resumed from November 1999 through 2000, we use InSAR data from these two periods to test the crustal strain behaviour of three models of magma supply: open, depleting and unbalanced. The InSAR observations of strain gradients of 75-80 mm/year/km of uplift during the period of quiescence on the western side of the volcano are consistent with an unbalanced model in which magma supply into a crustal magma chamber continues during quiescence, raising chamber pressure that is then released upon resumption of effusion. GPS motion vectors agree qualitatively with the InSAR displacements but are of smaller magnitude. The discrepancy may be due to inaccurate compensation for atmospheric delays in the InSAR data.
Abstract:
In the tender process, contractors often rely on subcontract and supply enquiries to calculate their bid prices. However, this integral part of the bidding process is not empirically articulated in the literature. Over 30 published materials on contractors' tendering processes that discuss enquiries were reviewed and found to be based mainly on experiential knowledge rather than systematic evidence. The empirical research reported here helps to describe the enquiry process precisely, improve it in practice, and provide some basis to support it in theory. Using a live participant-observation case study approach, the whole tender process was shadowed in the offices of two of the top 20 UK civil engineering construction firms. This helped to investigate 15 research questions on how contractors enquire about and obtain prices from subcontractors and suppliers. Forty-three subcontract enquiries and 18 supply enquiries were made across two different projects with an average value of 7m. An average of 15 subcontract packages and seven supply packages was involved; thus, two or three subcontractors or suppliers were invited to bid for each package. All enquiries were formulated by the estimator, with occasional involvement of three other personnel. Most subcontract prices were received in an average of 14 working days; supply prices took five days. The findings show 10 main activities involved in processing enquiries, their durations, and wasteful practices associated with enquiries. Contractors should limit their enquiry invitations to a maximum of three per package and optimize the waiting time for quotations in order to improve cost efficiency.