19 results for Large modeling projects

in Digital Commons at Florida International University


Relevance:

40.00%

Publisher:

Abstract:

Performance-based maintenance contracts differ significantly from the material- and method-based contracts traditionally used to maintain roads. Road agencies around the world have moved toward performance-based contracting because it offers advantages such as cost savings, better budgeting certainty, and better customer satisfaction through improved road services and conditions. In these contracts, payments for road maintenance are explicitly linked to the contractor meeting clearly defined minimum performance indicators. Quantitative evaluation of the cost of performance-based contracts is difficult because of the complexity of the pavement deterioration process. Based on a probabilistic analysis of failures to achieve multiple performance criteria over the length of the contract period, an effort has been made to develop a model capable of estimating the cost of these performance-based contracts. One essential function of such a model is to predict pavement performance as accurately as possible. Future pavement degradation is predicted using a Markov chain process, which requires estimating transition probabilities from previous deterioration rates for similar pavements. Transition probabilities were derived from historical pavement condition rating data, both for predicting pavement deterioration when there is no maintenance and for predicting pavement improvement when maintenance activities are performed. A methodological framework has been developed to estimate the cost of maintaining roads based on multiple performance criteria such as cracking, rutting, and roughness. The application of the developed model is demonstrated via a real case study of the Miami-Dade Expressways (MDX) using pavement condition rating data from the Florida Department of Transportation (FDOT) for a typical performance-based asphalt pavement maintenance contract.
Results indicated that the pavement performance model developed could predict pavement deterioration quite accurately. Sensitivity analysis shows that the model is responsive to even slight changes in pavement deterioration rate and performance constraints. The use of this model is expected to assist highway agencies and contractors in arriving at a fair contract value for long-term performance-based pavement maintenance work.
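The Markov chain prediction described above can be sketched in a few lines: a state-probability vector is advanced one year at a time by a transition matrix. The matrix below is hypothetical, chosen only to illustrate the mechanics, not derived from the FDOT condition data used in the study.

```python
# Illustrative sketch: predicting pavement condition with a Markov chain.
# States: 0 = Good, 1 = Fair, 2 = Poor; rows give one-year transition
# probabilities. These numbers are assumptions, not the study's estimates.

P = [
    [0.80, 0.15, 0.05],  # Good -> Good/Fair/Poor
    [0.00, 0.70, 0.30],  # Fair cannot improve without maintenance
    [0.00, 0.00, 1.00],  # Poor is absorbing until maintenance is applied
]

def step(state, P):
    """Advance a state-probability vector one year: state' = state @ P."""
    n = len(P)
    return [sum(state[i] * P[i][j] for i in range(n)) for j in range(n)]

def predict(state, P, years):
    for _ in range(years):
        state = step(state, P)
    return state

# Start with a pavement section known to be in Good condition.
dist = predict([1.0, 0.0, 0.0], P, 2)
print(dist)  # probability of Good/Fair/Poor after two years
```

Maintenance can be represented the same way, with a second "improvement" matrix applied in the years a treatment is performed, which is how the study handles both deterioration and improvement.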

Relevance:

40.00%

Publisher:

Abstract:

Network simulation is an indispensable tool for studying Internet-scale networks because of their heterogeneous structure, immense size, and changing properties. It is crucial for network simulators to generate representative traffic, which is necessary for effectively evaluating next-generation network protocols and applications. In network simulation, a distinction can be made between foreground traffic, which is generated by the target applications the researchers intend to study and therefore must be simulated with high fidelity, and background traffic, which represents traffic generated by other applications and does not require the same accuracy. Background traffic nonetheless has a significant impact on foreground traffic: it competes with the foreground traffic for network resources and can therefore drastically affect the behavior of the applications producing the foreground traffic. This dissertation aims to generate background traffic meaningfully in three respects. The first is realism. Realistic traffic characterization plays an important role in determining the correct outcome of simulation studies. This work starts by enhancing an existing fluid background traffic model, removing two of its unrealistic assumptions. The improved model correctly reflects network conditions in the reverse direction of the data traffic and reproduces the traffic burstiness observed in measurements. The second is scalability. The trade-off between accuracy and scalability is a constant theme in background traffic modeling. This work presents a fast rate-based TCP (RTCP) traffic model, which uses analytical models to represent TCP congestion control behavior. The model outperforms existing traffic models in that it correctly captures overall TCP behavior while achieving a speedup of more than two orders of magnitude over the corresponding packet-oriented simulation.
The third is network-wide traffic generation. Regardless of how detailed or scalable they are, existing models mainly focus on generating traffic on a single link and cannot easily be extended to more complicated network scenarios. This work presents a cluster-based spatio-temporal background traffic generation model that captures spatial and temporal traffic characteristics as well as their correlations. The resulting model can be used effectively for evaluation work in network studies.
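The appeal of rate-based modeling is that a flow's throughput can be computed from a closed-form expression instead of simulating every packet. The dissertation's RTCP model is not reproduced here; as a stand-in illustration, the sketch below uses the well-known Mathis et al. steady-state TCP throughput approximation.

```python
import math

# Illustrative sketch of rate-based TCP modeling: estimate a flow's
# steady-state send rate analytically rather than per packet. This is the
# standard Mathis et al. approximation, NOT the dissertation's RTCP model.

def tcp_rate_bps(mss_bytes, rtt_s, loss_prob):
    """Steady-state TCP throughput: rate ~ (MSS/RTT) * sqrt(3/2) / sqrt(p)."""
    return (mss_bytes * 8 / rtt_s) * math.sqrt(1.5) / math.sqrt(loss_prob)

# 1460-byte segments, 100 ms RTT, 1% loss -> roughly 1.4 Mbit/s.
rate = tcp_rate_bps(1460, 0.100, 0.01)
print(round(rate / 1e6, 2), "Mbit/s")
```

Evaluating one formula per flow per update interval, rather than every packet event, is what makes speedups of orders of magnitude over packet-oriented simulation plausible.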

Relevance:

30.00%

Publisher:

Abstract:

The subject of dropout prevention and reduction deservedly receives attention as a problem that, if not resolved, could threaten our national future. This study investigates a small segment of the overall dropout problem, one with apparently unique features of program design and population selection. The evidence presented here should add to the knowledge bank on this complicated problem. Project Trio was one of a number of dropout prevention programs and activities conducted in Dade County in the 1984-85 and 1985-86 school years, and it is investigated here longitudinally through the end of the 1987-88 school year. It involved 17 junior and senior high schools and 27 programs, 10 the first year and 17 the second, with over 1,000 total students, who had been selected by the schools from a district-provided list of "at risk" students and divided approximately evenly into the classical research design of an experimental group and a control group, which, following standard procedure, took the regular school curriculum. No school had more than 25 students in either group. Each school modified the basic design of the project to accommodate its individual characteristics and the perceived needs of its students; however, all school projects were to include some form of academic enhancement, counseling, and career awareness study. The conclusion of this study was that the control group had a significantly lower dropout rate than the experimental group.
Though it is impossible to determine with certainty the reasons for this unexpected result, the evidence presented suggests that one cause may have been inadequate administration at the local level. This study was also a longitudinal investigation of the "at risk" population as a whole over the three- and four-year period, to determine whether academic factors present in the records could be used to identify dropout proneness. A significant correlation was found between dropping out and various measures, including scores on the Quality of School Life instrument, attendance, grade point averages, mathematics grades, and being overage in grade, all of which are important identifiers for selection into dropout prevention programs.

Relevance:

30.00%

Publisher:

Abstract:

Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field called nanostructured materials has begun to take shape, taking advantage of these atom clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials can be tailored to perform as no previous material could. The ionized cluster beam (ICB) thin-film deposition technique was first proposed by Takagi in 1972. It was based on using a supersonic jet source to produce, ionize, and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for forming cluster beams suitable for thin-film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our lab and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development. In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and study of the use of ICB in the fabrication of semiconductor devices. The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory. A novel in situ, real-time mass, energy, and velocity measurement apparatus has been designed, built, and tested.
This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from the problems of other cluster size measurement methods, such as the need for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependence on the assumption of homogeneous nucleation, limits on the measurable size, and the lack of real-time capability. Ion energies measured with the electrostatic energy analyzer are in good accordance with values obtained from computer simulation. The velocity v is measured by pulsing the cluster beam and measuring the delay between the pulse and the analyzer output current. The mass of a particle is then calculated from m = 2E/v². The error in the measured mass of the background gas is on the order of 28% of the mass of one N₂ molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes. Selective area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of the conductor to insulator dimension, and substrate thickness.
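The mass calculation above follows directly from the kinetic energy relation E = ½mv², so m = 2E/v². The sketch below works through the arithmetic with made-up numbers (the energy, flight time, and flight length are illustrative assumptions, not measured values from the dissertation).

```python
# Sketch of the mass calculation described above: with ion kinetic energy E
# from the electrostatic analyzer and velocity v from the time-of-flight
# measurement, the particle mass follows from m = 2E/v^2.
# All input numbers here are illustrative assumptions.

EV = 1.602176634e-19                     # joules per electron-volt
ZN_ATOM_KG = 65.38 * 1.66053906660e-27   # mass of one zinc atom, kg

def cluster_mass(energy_ev, flight_time_s, flight_length_m):
    v = flight_length_m / flight_time_s   # velocity from the pulse delay
    return 2.0 * energy_ev * EV / v**2    # m = 2E/v^2

def atoms_per_cluster(mass_kg):
    return mass_kg / ZN_ATOM_KG

m = cluster_mass(energy_ev=2000.0, flight_time_s=1.0e-4, flight_length_m=0.5)
print(round(atoms_per_cluster(m)))  # approximate cluster size in Zn atoms
```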

Relevance:

30.00%

Publisher:

Abstract:

Shape memory alloys (SMAs) are a special class of metals that can undergo large deformation yet still recover their original shape through the mechanism of phase transformations. However, when they experience plastic slip, their ability to recover their original shape is reduced. This is due to dislocations generated by plastic flow that interfere with shape recovery through the shape memory effect and the superelastic effect. A one-dimensional model that captures the coupling between the shape memory effect, the superelastic effect, and plastic deformation is introduced. The shape memory alloy is assumed to have only three phases: austenite, positive-variant martensite, and negative-variant martensite. If the SMA flows plastically, each phase exhibits a dislocation field that permanently prevents a portion of it from being transformed back into the other phases; hence, less of the phase is available for subsequent phase transformations. A constitutive model was developed to depict this phenomenon and simulate the effect of plasticity on both the shape memory effect and the superelastic effect. In addition, experimental tests were conducted to characterize the phenomenon in shape memory wire and superelastic wire. The constitutive model was then implemented within a finite element context as a UMAT (User MATerial subroutine) for the commercial finite element package ABAQUS. The model is phenomenological in nature and is based on the construction of a stress-temperature phase diagram. The model has been shown to capture the qualitative and quantitative aspects of the coupling between plasticity and the shape memory effect, and between plasticity and the superelastic effect, within acceptable limits. As a verification case, a simple truss structure was built, tested, and then simulated using the FEA constitutive model. The results were found to be close to the experimental data.
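The central bookkeeping idea, that plastic slip permanently "locks in" part of each phase so it can no longer transform, can be shown with a toy fraction model. The lock-in rule and the numbers below are hypothetical, chosen only to illustrate the accounting, not the dissertation's constitutive equations.

```python
# Toy sketch of the coupling described above: plastic slip pins a portion of
# each phase with dislocations, shrinking the fraction still available for
# transformation. Rule and numbers are illustrative assumptions.

class SMAPhase:
    def __init__(self):
        self.fraction = 1.0   # total material fraction in this phase
        self.locked = 0.0     # portion pinned by dislocations

    def plastic_slip(self, severity):
        """Lock a share of the still-transformable material (0 <= severity <= 1)."""
        transformable = self.fraction - self.locked
        self.locked += severity * transformable

    def transformable(self):
        return self.fraction - self.locked

aust = SMAPhase()
aust.plastic_slip(0.2)       # first overload locks 20% of what can transform
aust.plastic_slip(0.2)       # a second event locks 20% of the remainder
print(aust.transformable())  # about 0.64 of the phase remains available
```

Each slip event shrinks the transformable pool multiplicatively, which is why repeated plastic loading progressively degrades both the shape memory and superelastic responses.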

Relevance:

30.00%

Publisher:

Abstract:

A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using lattice Boltzmann methods (LBMs), which contrast with other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed, and implementation issues are discussed. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains, and the solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM's flexibility as a solute transport solver. The LBM is then applied to simulate solute transport and fluid flow in porous media traversed by larger conduits. An LBM-based macroscopic flow solver (based on Darcy's law) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D); however, the new LBM-based model retains the ability to solve the inertial flows characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches.
The analogy between Fick's second law (the diffusion equation) and the transient ground water flow equation is used to solve the transient head distribution. An altered-velocity flow solver with a source/sink term is applied to simulate a drawdown curve. Hydraulic parameters such as transmissivity and the storage coefficient are linked with LB parameters. These capabilities complete the LBM's effective treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
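The diffusion-equation analogy mentioned above is the simplest entry point to lattice Boltzmann modeling. The sketch below is a minimal one-dimensional D1Q2 toy (two populations, collide toward equilibrium, then stream), assuming periodic boundaries; it illustrates the collide-and-stream structure, not any solver from the dissertation.

```python
# Minimal sketch of a lattice Boltzmann solver for the 1-D diffusion
# equation, the same analogy used for transient ground water flow (hydraulic
# head plays the role of concentration). D1Q2, periodic boundaries,
# illustrative only.

def lbm_diffusion(rho, tau=1.0, steps=50):
    n = len(rho)
    fp = [r / 2 for r in rho]   # population moving right
    fm = [r / 2 for r in rho]   # population moving left
    for _ in range(steps):
        # BGK collision: relax each direction toward equilibrium rho/2.
        for i in range(n):
            eq = (fp[i] + fm[i]) / 2
            fp[i] += (eq - fp[i]) / tau
            fm[i] += (eq - fm[i]) / tau
        # Streaming: shift populations one node in their travel direction.
        fp = [fp[(i - 1) % n] for i in range(n)]
        fm = [fm[(i + 1) % n] for i in range(n)]
    return [fp[i] + fm[i] for i in range(n)]

# A concentration spike spreads out while total mass is conserved.
out = lbm_diffusion([0.0] * 20 + [1.0] + [0.0] * 20)
print(round(sum(out), 6), max(out))
```

The same collide-and-stream loop, with more velocity directions and a momentum-carrying equilibrium, is what underlies the two- and three-dimensional flow and transport solvers the dissertation builds on.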

Relevance:

30.00%

Publisher:

Abstract:

In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although many conceptual frameworks for using multi-agent systems existed, there was no well-established and widely accepted method for modeling them. This dissertation research addressed the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective was to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of design. Given that no well-defined formal model directly supported agent-oriented modeling, this study was centered on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model checking technology. PrT nets were extended to include the notions of dynamic structure, agent communication, and coordination to support agent-oriented modeling. An aspect-oriented technique was developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules was defined to systematically translate formal MAS models into concrete models that can be verified with the model checker SPIN (Simple Promela Interpreter). This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets and a comprehensive methodology to guide the construction and analysis of formal MAS models.
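PrT nets enrich ordinary Petri nets with structured tokens and predicates on transitions; the toy below shows only the underlying place/transition firing rule that such models build on. The net layout (two agents contending for a lock) is a hypothetical example, not a model from the dissertation.

```python
# Plain place/transition net firing rule, the substrate beneath PrT nets.
# Markings map place names to token counts; pre/post are arc weights.

def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n          # consume input tokens
    for p, n in post.items():
        m[p] = m.get(p, 0) + n   # produce output tokens
    return m

# Two agents coordinating over a shared resource: 'idle' -> 'working'
# consumes the single 'lock' token, so only one agent can proceed.
m0 = {"idle": 2, "lock": 1, "working": 0}
m1 = fire(m0, pre={"idle": 1, "lock": 1}, post={"working": 1})
print(m1)                                   # {'idle': 1, 'lock': 0, 'working': 1}
print(enabled(m1, {"idle": 1, "lock": 1}))  # False: the lock is taken
```

Model checking with SPIN amounts to exhaustively exploring the graph of reachable markings produced by this firing rule and checking properties (e.g., mutual exclusion) over it.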

Relevance:

30.00%

Publisher:

Abstract:

The major objectives of this dissertation were to develop optimal spatial techniques to model the spatial-temporal changes of the lake sediments and their nutrients from 1988 to 2006, and to evaluate the impacts of the hurricanes that occurred during 1998–2006. The mud zone shrank by about 10.5% from 1988 to 1998 and grew by about 6.2% from 1998 to 2006. Mud areas, volumes, and weights were calculated using validated kriging models. From 1988 to 1998, mud thicknesses increased by up to 26 cm in the central lake area, while the mud area and volume decreased by about 13.78% and 10.26%, respectively. From 1998 to 2006, mud depths declined by up to 41 cm in the central lake area, and mud volume decreased by about 27%. Mud weight increased by up to 29.32% from 1988 to 1998 but decreased by over 20% from 1998 to 2006. The reduction in mud sediments is likely due to re-suspension and redistribution by the waves and currents produced by large storm events, particularly Hurricanes Frances and Jeanne in 2004 and Wilma in 2005. Regression, kriging, geographically weighted regression (GWR), and regression-kriging models were calibrated and validated for the spatial analysis of the lake's sediment TP and TN. GWR models provide the most accurate predictions for TP and TN based on model performance and error analysis. TP values declined from an average of 651 to 593 mg/kg from 1998 to 2006, especially in the lake's western and southern regions. From 1988 to 1998, TP declined in the northern and southern areas and increased in the central-western part of the lake. TP weights increased by about 37.99%–43.68% from 1988 to 1998 and decreased by about 29.72%–34.42% from 1998 to 2006. From 1988 to 1998, TN decreased in most areas, especially in the northern and southern lake regions; the western littoral zone had the biggest increase, up to 40,000 mg/kg. From 1998 to 2006, TN declined from an average of 9,363 to 8,926 mg/kg, especially in the central and southern regions.
The biggest increases occurred in the northern lake and southern edge areas. TN weights increased by about 15%–16.2% from 1988 to 1998 and decreased by about 7%–11% from 1998 to 2006.
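The study interpolates scattered sediment samples with kriging and GWR; as a simpler stand-in, the sketch below uses inverse-distance weighting (IDW) only to illustrate the basic idea of predicting a value at an unsampled location from nearby measurements. The sample coordinates and TP values are made up.

```python
# Inverse-distance weighting: a simple spatial interpolator, shown here as a
# stand-in for the kriging/GWR models used in the study. Data are invented.

def idw(samples, x, y, power=2.0):
    """samples: list of (x, y, value). Returns the IDW estimate at (x, y)."""
    num = den = 0.0
    for sx, sy, v in samples:
        d2 = (sx - x) ** 2 + (sy - y) ** 2
        if d2 == 0.0:
            return v                      # exactly at a sample point
        w = 1.0 / d2 ** (power / 2.0)     # weight = 1 / distance^power
        num += w * v
        den += w
    return num / den

# Hypothetical total-phosphorus samples (mg/kg) at lake coordinates (km).
tp = [(0.0, 0.0, 600.0), (4.0, 0.0, 650.0), (0.0, 4.0, 550.0)]
print(round(idw(tp, 2.0, 2.0), 1))  # estimate at a point between samples
```

Kriging replaces these ad hoc distance weights with weights derived from a fitted variogram, and GWR goes further by letting regression coefficients themselves vary over space, which is why those models outperformed simpler ones in the study's error analysis.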

Relevance:

30.00%

Publisher:

Abstract:

Planning for complex ecosystem restoration projects involves integrating ecological modeling with analysis of performance trade-offs among restoration alternatives. The authors used the Everglades Landscape Model and Multi-Criteria Decision Analysis to explore the effect of simulated ecosystem performance, risk preferences, and criteria weights on the ranking of three alternatives to restoring overland sheet flow in the Everglades. The ecological model outputs included both hydrologic and water quality criteria. Results were scored in the decision analysis framework, highlighting the trade-offs between hydrologic restoration and water quality constraints. Given equal weighting of performance measures, the alternative with more homogenous sheet flow was preferred over other alternatives, despite evidence of some localized eutrophication risk.
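The ranking step described above, scoring each restoration alternative on weighted criteria, can be sketched as a simple weighted sum. The alternatives, criteria, scores, and weights below are hypothetical placeholders, not Everglades Landscape Model output.

```python
# Sketch of multi-criteria scoring for ranking restoration alternatives:
# normalize each performance measure to [0, 1], weight it, and sum.
# All names and numbers here are illustrative assumptions.

def weighted_score(performance, weights):
    """performance: criterion -> normalized score; weights sum to 1."""
    return sum(weights[c] * performance[c] for c in weights)

weights = {"sheet_flow": 0.5, "water_quality": 0.5}   # equal weighting
alts = {
    "A": {"sheet_flow": 0.9, "water_quality": 0.6},   # more homogeneous flow
    "B": {"sheet_flow": 0.6, "water_quality": 0.8},
    "C": {"sheet_flow": 0.5, "water_quality": 0.7},
}
ranked = sorted(alts, key=lambda a: weighted_score(alts[a], weights),
                reverse=True)
print(ranked)  # ['A', 'B', 'C']
```

Varying the weights (e.g., emphasizing water quality over hydrology) can reorder the alternatives, which is exactly the sensitivity to criteria weights and risk preferences the authors explore.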

Relevance:

30.00%

Publisher:

Abstract:

Small-bodied fishes constitute an important assemblage in many wetlands. In wetlands that dry periodically except for small permanent waterbodies, these fishes are quick to respond to change and can undergo large fluctuations in numbers and biomasses. An important aspect of landscapes that are mixtures of marsh and permanent waterbodies is that high rates of biomass production occur in the marshes during flooding phases, while the permanent waterbodies serve as refuges for many biotic components during the dry phases. The temporal and spatial dynamics of the small fishes are ecologically important, as these fishes provide a crucial food base for higher trophic levels, such as wading birds. We develop a simple model that is analytically tractable, describing the main processes of the spatio-temporal dynamics of a population of small-bodied fish in a seasonal wetland environment, consisting of marsh and permanent waterbodies. The population expands into newly flooded areas during the wet season and contracts during declining water levels in the dry season. If the marsh dries completely during these times (a drydown), the fish need refuge in permanent waterbodies. At least three new and general conclusions arise from the model: (1) there is an optimal rate at which fish should expand into a newly flooding area to maximize population production; (2) there is also a fluctuation amplitude of water level that maximizes fish production, and (3) there is an upper limit on the number of fish that can reach a permanent waterbody during a drydown, no matter how large the marsh surface area is that drains into the waterbody. Because water levels can be manipulated in many wetlands, it is useful to have an understanding of the role of these fluctuations.
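The seasonal dynamics described above can be caricatured with a one-line difference equation: biomass grows logistically over the flooded marsh in the wet season, and a drydown caps the survivors at the refuge's capacity, illustrating conclusion (3). All parameters are hypothetical, not fitted values from the paper's model.

```python
# Toy annual cycle for small-bodied marsh fish: logistic wet-season growth
# over the flooded area, then a drydown that caps survivors at the capacity
# of the permanent waterbody. Parameters are illustrative assumptions.

def simulate(years, refuge_cap=10.0, growth=1.5, start=5.0):
    biomass = start
    for _ in range(years):
        area = 100.0                     # wet-season flooded area (arbitrary units)
        k = area * 1.0                   # carrying capacity scales with area
        biomass += growth * biomass * (1 - biomass / k)  # wet-season growth
        biomass = min(biomass, refuge_cap)  # drydown: refuge caps survivors
    return biomass

print(simulate(3))  # pinned at the refuge cap after each drydown
```

No matter how large the marsh (and hence the wet-season biomass) becomes, the year-end population is bounded by `refuge_cap`, mirroring the paper's upper limit on fish reaching a permanent waterbody during a drydown.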

Relevance:

30.00%

Publisher:

Abstract:

Buildings and other infrastructure located in the coastal regions of the US have a high level of wind vulnerability. Reducing the growing property losses and casualties associated with severe windstorms has been the central research focus of the wind engineering community. The present wind engineering toolbox consists of building codes and standards, laboratory experiments, and field measurements. The American Society of Civil Engineers (ASCE) 7 standard provides wind loads only for buildings with common shapes; for complex cases it refers to physical modeling. Although that option can be economically viable for large projects, it is not cost-effective for low-rise residential houses. To circumvent these limitations, a numerical approach based on the techniques of computational fluid dynamics (CFD) has been developed. Recent advances in computing technology and significant developments in turbulence modeling are making numerical evaluation of wind effects a more affordable approach. The present study targeted cases that are not addressed by the standards: wind loads on complex roofs of low-rise buildings, the aerodynamics of tall buildings, and the effects of complex surrounding buildings. Among all the turbulence models investigated, the large eddy simulation (LES) model performed best in predicting wind loads. Applying a spatially evolving, time-dependent wind velocity field with the relevant turbulence structures at the inlet boundaries was found to be essential. All results were compared with and validated against experimental data. The study also revealed CFD's unique flow visualization and aerodynamic data generation capabilities, along with a better understanding of the complex three-dimensional aerodynamics of wind-structure interaction. With proper modeling that realistically represents the actual turbulent atmospheric boundary layer flow, CFD can offer an economical alternative to the existing wind engineering tools.
CFD's easy accessibility is expected to transform the practice of structural design for wind, encouraging optimal aerodynamic design and resulting in more wind-resilient and sustainable systems. This method will thus help ensure public safety and reduce economic losses due to wind perils.
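For the common shapes the standards do cover, the code-based calculation CFD complements is a short formula chain: an ASCE 7-style velocity pressure qz = 0.00256·Kz·Kzt·Kd·V² (psf, with V in mph), then a design pressure p = q·G·Cp. The coefficient values below are illustrative assumptions, not lookups from the standard's tables.

```python
# Sketch of the code-based wind load calculation for common building shapes
# (the complex cases the study targets are those this formula cannot cover).
# ASCE 7-style velocity pressure in psf, V in mph; coefficient values are
# illustrative assumptions, not table lookups.

def velocity_pressure_psf(v_mph, kz=0.85, kzt=1.0, kd=0.85):
    return 0.00256 * kz * kzt * kd * v_mph ** 2   # qz = 0.00256*Kz*Kzt*Kd*V^2

def design_pressure_psf(qz, gust_factor=0.85, cp=0.8):
    return qz * gust_factor * cp                  # p = q * G * Cp

qz = velocity_pressure_psf(150.0)          # hurricane-strength 150 mph wind
print(round(design_pressure_psf(qz), 1))   # windward wall pressure, psf
```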

Relevance:

30.00%

Publisher:

Abstract:

Low-rise buildings are often subjected to high wind loads during hurricanes, leading to severe damage and water intrusion. It is therefore important to estimate accurate wind pressures for design purposes in order to reduce losses. Wind loads measured on low-rise buildings can differ significantly depending upon the laboratory in which they were measured. The differences are due in large part to inadequate laboratory simulation of the low-frequency content of atmospheric velocity fluctuations and to the small scale of the models used for the measurements. A new partial turbulence simulation methodology was developed for simulating the effect of low-frequency flow fluctuations on low-rise buildings more effectively, from the point of view of testing accuracy and repeatability, than is currently the case. The methodology was validated by comparing aerodynamic pressure data for building models obtained in the open-jet 12-fan Wall of Wind (WOW) facility against their counterparts from a boundary-layer wind tunnel. Field measurements of pressures on the Texas Tech University building and the Silsoe building were also used for validation. Tests under partial simulation are freed of integral length scale constraints, meaning that model length scales in such testing are limited only by blockage considerations. The partial simulation methodology can thus be used to produce aerodynamic data for low-rise buildings using large-scale models in wind tunnels and WOW-like facilities. This is a major advantage, because large-scale models allow for accurate modeling of architectural details, testing at higher Reynolds numbers, greater spatial resolution of the pressure taps in high-pressure zones, and assessment of the performance of aerodynamic devices intended to reduce wind effects.
The technique eliminates a major cause of discrepancies among measurements conducted in different laboratories and can help standardize flow simulations for testing residential homes as well as significantly improve testing accuracy and repeatability. Partial turbulence simulation was used in the WOW to determine the performance of discontinuous perforated parapets in mitigating roof pressures. Comparisons of pressures with and without parapets showed significant reductions in pressure coefficients in the zones with high suctions, demonstrating the potential of such aerodynamic add-on devices to reduce uplift forces.
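The pressure-coefficient comparisons mentioned above reduce each tap's record to a dimensionless form, Cp(t) = (p(t) − p_static) / (½ρU²), so results from different facilities and scales can be compared directly. The sample tap record below is made up for illustration.

```python
# Sketch of reducing test pressures to coefficients:
# Cp(t) = (p(t) - p_static) / (0.5 * rho * U^2), then read off peak suction.
# The tap record below is invented, not WOW data.

RHO_AIR = 1.225  # kg/m^3, standard sea-level air density

def pressure_coefficients(p_series_pa, p_static_pa, u_mean_ms):
    q = 0.5 * RHO_AIR * u_mean_ms ** 2     # mean dynamic pressure
    return [(p - p_static_pa) / q for p in p_series_pa]

# Hypothetical roof-corner tap record (Pa) at a 20 m/s mean wind speed.
taps = [-300.0, -650.0, -180.0, -720.0, -250.0]
cps = pressure_coefficients(taps, p_static_pa=0.0, u_mean_ms=20.0)
print(round(min(cps), 2))  # peak suction coefficient
```

Comparing the peak (most negative) Cp with and without a parapet, tap by tap, is the form the mitigation comparison takes.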

Relevance:

30.00%

Publisher:

Abstract:

The integrated project delivery (IPD) method has recently emerged as an alternative to traditional delivery methods. It has the potential to overcome the inefficiencies of traditional delivery methods by enhancing collaboration among project participants. Information and communication technology (ICT) facilitates IPD through effective management, processing, and communication of information within and among organizations. While the benefits of IPD, and the role of ICT in realizing them, have been generally acknowledged, the US public construction sector has been very slow to adopt IPD. The reasons are a lack of experience with and an inadequate understanding of IPD among public owners, as confirmed by the results of the questionnaire survey conducted for this research study. The public construction sector should be aware of the value of IPD and should know the essentials for effective implementation of IPD principles; in particular, it should be cognizant of the opportunities offered by advancements in ICT to realize this. To address this need, an IPD Readiness Assessment Model (IPD-RAM) was developed in this research study. The model was designed to determine the IPD readiness of a public owner organization, considering selected IPD principles and the ICT levels at which project functions are carried out. Subsequent analysis identifies possible improvements in ICT that have the potential to increase IPD readiness scores. Termed gap identification, this process was used to formulate improvement strategies. The model was applied to six Florida International University (FIU) construction projects (case studies). The results showed that the IPD readiness of the organization was considerably low and that several project functions could be improved by using higher-level and/or more advanced ICT tools and methods.
Feedback from a focus group comprising FIU officials and an independent group of experts was received at various stages of this research and was used during development and implementation of the model. The focus group input was also helpful for validating the model and its results. It is hoped that the model will be useful to construction owner organizations in assessing their IPD readiness and identifying appropriate ICT improvement strategies.
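The readiness scoring and gap identification described above can be sketched as a weighted gap analysis: each project function gets a current and a target ICT level, the readiness score is the weighted fraction of the target achieved, and the functions falling furthest short are the gaps. The functions, levels, and weights below are hypothetical, not IPD-RAM's actual scheme.

```python
# Sketch of readiness scoring and gap identification: weighted fraction of
# target ICT level achieved per project function. Function names, the 1-4
# level scale, and the weights are illustrative assumptions.

def readiness(functions):
    """functions: name -> (current_level, target_level, weight)."""
    total = sum(w for _, _, w in functions.values())
    score = sum(w * min(cur / tgt, 1.0) for cur, tgt, w in functions.values())
    return score / total

def gaps(functions):
    """Functions ranked by how far current ICT falls below its target."""
    return sorted(functions, key=lambda f: functions[f][0] / functions[f][1])

funcs = {
    "design_review": (2, 4, 1.0),   # levels on a hypothetical 1-4 ICT scale
    "cost_tracking": (3, 4, 1.0),
    "communication": (1, 4, 1.0),
}
print(round(readiness(funcs), 2))  # overall readiness on [0, 1]
print(gaps(funcs)[0])              # biggest improvement opportunity
```

Re-running the score with a candidate ICT upgrade applied to the worst-gap function shows how much that improvement strategy would raise overall readiness, which mirrors the model's gap-identification step.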