Figure 4 - uploaded by Jeffrey P. Kharoufeh
Average price and wind generation levels in the year 2012.

Source publication
Article
Full-text available
We consider the problem of dynamically controlling a 2-bus energy distribution network with energy storage capabilities. An operator seeks to dynamically adjust the amount of energy to charge to, or discharge from, energy storage devices in response to randomly evolving demand, renewable supply and prices. The objective is to minimize the expected...

Contexts in source publication

Context 1
... the analysis that follows, P_t, R_t^1, and R_t^2 are assumed to be mutually independent random variables. Figure 4 depicts the average hourly wind generation at bus 1 and price levels (and associated 95% confidence intervals) for a 24-hour period. In this figure, hour 1 is midnight to 0100, hour 2 is 0100-0200, hour 3 is 0200-0300, and so forth. ...
Context 2
... this figure, hour 1 is midnight to 0100, hour 2 is 0100-0200, hour 3 is 0200-0300, and so forth. Examining Figure 4(a) (real-time hourly electricity prices) closely, it is seen that the evening hours (hours 17 to 21) are the peak-price periods, while the off-peak price periods span the late night and early morning hours (hours 1 to 7). ...
Context 3
... variability in the hourly prices exhibits a similar trend. By contrast, as seen in Figure 4(b), wind power output is highest during the late night and early morning hours and is lowest in the afternoon (hours 12 to 16). ...

Similar publications

Article
Full-text available
With the large-scale establishment of cross-camera networks, edge computing plays an important role in real-time tasks with its abundant edge resources and flexible task offloading strategy. Conventional studies usually utilize cross-camera network topology and real-time task status to generate subtask offloading strategies. However, most existing a...

Citations

... Meanwhile, these two policies have been widely used to control the MG system. For example, Bhattacharya et al. (2018) took advantage of myopic policies to demonstrate significant cost savings in the energy storage management of MGs; Hamidi and Livani (2017) proposed a decentralized and myopic algorithm for the charging management of plug-in hybrid electric vehicles (PHEVs) in a distribution network. In terms of lookahead policies, Palma-Behnke et al. (2013) employed a rolling horizon strategy to implement energy management in an MG; Amgai et al. (2014) developed an integrated lookahead control-based adaptive supervisory framework for autonomic power system applications; Zhu et al. (2010) proposed a novel multistep coordinated control approach based on lookahead policies for MG management. ...
Article
Full-text available
This paper presents an efficient data-driven building electricity management system that integrates battery energy storage (BES) and photovoltaic panels to support decision-making capabilities. In this micro-grid (MG) system, the solar panels and the power grid supply electricity to the building, and the BES acts as a buffer to alleviate the uncertain effects of solar energy generation and the building's demand. In this study, we formulate the problem as a Markov decision process and model the uncertainties in the MG system using the martingale model of forecast evolution method. To control the system, lookahead policies with deterministic/stochastic forecasts are implemented. In addition, wait-and-see, greedy and updated greedy policies are used to benchmark the performance of the lookahead policies. Furthermore, by varying the charging/discharging rate, we obtain different battery sizes (E_s) and transmission line power capacities (P_max), and then investigate how the different E_s and P_max values affect the performance of the control policies. The numerical experiments demonstrate that the lookahead policy with stochastic forecasts performs better than the lookahead policy with deterministic forecasts when E_s and P_max are large enough, and that the lookahead policies outperform the greedy and updated greedy policies in all case studies.
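The greedy-versus-lookahead comparison in this abstract can be sketched in miniature. The toy example below (all prices, demands and capacities are made up for illustration; the lookahead is implemented as a simple backward dynamic program under a perfect deterministic forecast, not the paper's forecast-evolution method) shows why a lookahead policy can beat a myopic rule:

```python
# Toy single-battery dispatch comparing a greedy (myopic) policy with a
# deterministic-forecast lookahead computed by backward dynamic programming
# over a discretized state of charge. All numbers are illustrative only.

CAP = 4                              # battery capacity, in discrete units
UNIT = 1.0                           # kWh per unit of charge
prices = [2, 2, 1, 1, 5, 5, 4, 2]   # hypothetical $/kWh over 8 hours
demand = [1, 1, 1, 1, 1, 1, 1, 1]   # hypothetical kWh demand per hour

def greedy_cost(thresh=3):
    """Myopic rule: charge one unit when the current price is below
    thresh, otherwise discharge one unit to serve demand."""
    soc, cost = 0, 0.0
    for p, d in zip(prices, demand):
        if p < thresh and soc < CAP:
            soc += 1
            cost += p * (d + UNIT)          # buy demand plus one unit
        elif p >= thresh and soc > 0:
            soc -= 1
            cost += p * max(d - UNIT, 0.0)  # serve demand from storage
        else:
            cost += p * d
    return cost

def lookahead_cost():
    """Backward DP over the state of charge, assuming the whole
    price/demand trajectory is known (a deterministic forecast)."""
    T, INF = len(prices), float("inf")
    V = [0.0] * (CAP + 1)                   # terminal cost-to-go
    for t in reversed(range(T)):
        newV = [INF] * (CAP + 1)
        for s in range(CAP + 1):
            for a in (-1, 0, 1):            # discharge / idle / charge
                s2 = s + a
                if not 0 <= s2 <= CAP:
                    continue
                buy = demand[t] + a * UNIT  # net grid purchase this hour
                if buy < 0:
                    continue                # no selling back in this sketch
                newV[s] = min(newV[s], prices[t] * buy + V[s2])
        V = newV
    return V[0]                             # battery starts empty

print(greedy_cost(), lookahead_cost())      # lookahead never costs more
```

In this instance the lookahead pre-charges during the cheap early hours and rides out the entire price peak on stored energy, while the fixed-threshold greedy rule buys back a unit at the end of the day; with a reliable forecast the lookahead cost is a lower bound on what the myopic rule can achieve.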
... Using this formula, the above objective function can be calculated and the corresponding value can be obtained [13]. ...
... In the model design process, the investment cost, operation cost and equipment maintenance cost [12][13] are taken as the main reference quantities of the model, and toughness enhancement technology is used to process these three parts of the data. Under the premise of system decision-making permission, this constraint condition is traded off against the objective function, yielding the following formula [18]. ...
Article
Full-text available
At present, abnormal objective-function settings in existing energy storage depth planning methods for AC/DC distribution networks frequently increase the operation cost of the power supply network. Therefore, an energy storage depth planning method for AC/DC distribution networks based on toughness enhancement technology is designed. The minimum peak-valley difference, maximum load rate and minimum load change are taken as the energy storage depth planning objectives of the AC/DC distribution network. An economic mathematical model of distribution network planning is constructed using toughness enhancement technology and the objective function to complete the in-depth planning of the distribution network. A simulation experiment was conducted; the comparison shows that this method performs better than the original method and possesses stronger control ability over the operating components.
... Thus, in the last few years, research on developing energy storage control policies has received a lot of attention. Specifically, a few recent papers establish structural properties of optimal storage operation policies in a variety of MDP settings [4][5][6]. Moreover, there is a growing literature that develops approximation-based on-line algorithms, such as Lyapunov optimization [7,8], approximate dynamic programming (ADP)-based heuristic algorithms [9], and rule-based algorithms [10]. ...
Conference Paper
Full-text available
Considering a scenario in which an end-user equipped with battery energy storage and solar photovoltaic panels participates in a demand response program, this study develops a control policy for energy storage operations that minimizes the end-user's electricity cost in response to time-varying power demand, intermittent renewable generation, and time-of-use electricity prices. Specifically, it proposes a threshold-based control policy that determines static, time-varying thresholds on the state of charge level using two-stage stochastic programming on historical data. The proposed policy operates the energy storage by charging and discharging so that the state of charge never falls below the threshold. Numerical experiments with real historical data and practical parameter settings are conducted to evaluate the performance of the proposed threshold-based control policy. In conclusion, a future research plan is suggested to improve the threshold-based control policy.
Article
Assuming that a residential electricity consumer is equipped with solar photovoltaic panels integrated with energy storage while participating in a demand response program with time-varying prices, this study focuses on developing a control policy for energy storage operations that minimizes the consumer's electricity cost. In particular, it develops a threshold-based control policy that adjusts the energy storage levels by charging and discharging so that the storage levels are bounded from below by the thresholds across discrete time periods. The thresholds are derived so that consumers can avoid the peak electricity rate and utilize more solar power generation while meeting the electricity demand. Specifically, a set of rule constraints is developed to enforce logical conditions on energy storage operations under the proposed control policy, and is integrated with a two-stage stochastic program to find proper thresholds using a real historical data set. Once the thresholds are obtained by solving the proposed rule-constrained two-stage stochastic program, the proposed control policy can be implemented to control energy storage operations. Compared to existing approaches, the proposed control policy has merits in terms of practical application. Numerical experiments conducted with various residential house data show that the proposed threshold-based control policies result in a 1%–4% gap relative to off-line optimal operations in terms of total energy cost. Also, compared to reinforcement learning-based approaches, the proposed control policies show better performance with less computational time required to train the models.
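The threshold rule described in this abstract (recharge whenever the state of charge drops below the hour's threshold, discharge the surplus during peak-price hours) can be sketched as follows. The thresholds, prices, solar and demand profiles below are hand-picked for illustration only; the paper derives the thresholds from a rule-constrained two-stage stochastic program rather than by hand:

```python
# Illustrative threshold-based storage rule: keep the state of charge (SoC)
# at or above a per-hour threshold, recharging when below it and discharging
# the surplus above it during peak-price hours. All data below are made up.

def threshold_policy(soc, t, thresholds, price, peak_price, rate, cap):
    """Return the charge (+) or discharge (-) amount for hour t."""
    if soc < thresholds[t]:
        # Below the threshold: recharge, limited by rate and capacity.
        return min(thresholds[t] - soc, rate, cap - soc)
    if price >= peak_price:
        # Peak hour and at/above the threshold: discharge the surplus
        # above the threshold to offset grid purchases.
        return -min(soc - thresholds[t], rate)
    return 0.0

# One-day (8-hour) rollout with hypothetical data.
prices     = [1, 1, 1, 2, 5, 5, 5, 2]   # $/kWh; hours 4-6 are peak
solar      = [0, 0, 1, 2, 2, 1, 0, 0]   # kWh of PV generation per hour
demand     = [1, 1, 1, 1, 2, 2, 2, 1]   # kWh of household demand per hour
thresholds = [1, 2, 3, 3, 0, 0, 0, 1]   # minimum SoC target per hour
PEAK, RATE, CAP = 5, 2, 4

soc, cost = 0.0, 0.0
for t in range(8):
    a = threshold_policy(soc, t, thresholds, prices[t], PEAK, RATE, CAP)
    soc += a
    grid = max(demand[t] + a - solar[t], 0)  # net purchase after solar
    cost += prices[t] * grid
print(f"total cost: {cost}")
```

The rising pre-peak thresholds force the battery to fill during cheap hours, and the zero thresholds over the peak let it drain; choosing those numbers well is exactly what the paper's two-stage stochastic program is for.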