A client side WWW prefetching model

Abdullah Balamash, Marwan M. Krunz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Network congestion remains one of the main barriers to the continuing success of the Internet. For web users, congestion manifests itself in unacceptably long response times. One possible remedy to the latency problem is to use caching at the client, the proxy server, or even within the Internet. However, WWW documents are becoming increasingly dynamic (i.e., they have short lifetimes), which limits the potential benefit of caching. The performance of a WWW caching system can be dramatically increased by integrating document prefetching (a.k.a. "proactive caching") into its design. While prefetching reduces the perceived user response time, it also increases network load, which in turn may increase the response time. In this study, we investigate this tradeoff through a mathematical model of a WWW caching/prefetching system. In our model, the client cache is divided into a "regular" cache for on-demand requests and a "prefetching cache" for prefetched requests. A set of such clients connects to a proxy server through bandwidth-limited dedicated lines (e.g., dialup phone lines). The proxy server implements its own caching system. Forecasting of future document requests is performed at the client based on the client's access profile and hints from the servers. Our analysis sheds light on the tradeoff between aggressive and conservative prefetching, and can be used to optimize the parameters of a combined caching/prefetching system.
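
A rough illustration (not from the paper itself): the Python sketch below mimics the split-cache arrangement the abstract describes, with an LRU "regular" segment for on-demand requests, a separate "prefetching" segment for speculatively fetched documents, and a probability threshold that controls how aggressive prefetching is. All class, method, and parameter names are hypothetical.

```python
from collections import OrderedDict


class SplitClientCache:
    """Client cache split into a "regular" segment for on-demand requests
    and a "prefetching" segment for speculatively fetched documents, each
    managed with LRU replacement (illustrative names only)."""

    def __init__(self, regular_slots, prefetch_slots, prefetch_threshold):
        self.regular = OrderedDict()      # documents the user actually requested
        self.prefetched = OrderedDict()   # documents fetched on speculation
        self.regular_slots = regular_slots
        self.prefetch_slots = prefetch_slots
        self.threshold = prefetch_threshold   # lower value => more aggressive

    def _evict_if_full(self, cache, slots):
        # Drop least recently used entries until the segment fits its quota.
        while len(cache) > slots:
            cache.popitem(last=False)

    def request(self, doc_id, fetch_from_proxy):
        """Serve an on-demand request; return (document, served_from_cache)."""
        if doc_id in self.regular:
            self.regular.move_to_end(doc_id)
            return self.regular[doc_id], True
        if doc_id in self.prefetched:
            # Correct prediction: promote the document to the regular segment.
            doc, hit = self.prefetched.pop(doc_id), True
        else:
            # Miss: fetch over the bandwidth-limited client-proxy link.
            doc, hit = fetch_from_proxy(doc_id), False
        self.regular[doc_id] = doc
        self._evict_if_full(self.regular, self.regular_slots)
        return doc, hit

    def prefetch(self, predictions, fetch_from_proxy):
        """Prefetch every candidate whose predicted access probability
        (from the access profile and server hints) meets the threshold."""
        for doc_id, prob in predictions.items():
            if (prob >= self.threshold
                    and doc_id not in self.regular
                    and doc_id not in self.prefetched):
                self.prefetched[doc_id] = fetch_from_proxy(doc_id)
                self._evict_if_full(self.prefetched, self.prefetch_slots)


if __name__ == "__main__":
    fetch = lambda url: "contents of " + url   # stand-in for a proxy fetch
    cache = SplitClientCache(regular_slots=100, prefetch_slots=20,
                             prefetch_threshold=0.4)
    cache.prefetch({"/news.html": 0.7, "/archive.html": 0.1}, fetch)
    _, hit = cache.request("/news.html", fetch)   # served from the prefetch cache
    print("prefetch hit:", hit)                   # -> prefetch hit: True
```

In this toy setting, lowering prefetch_threshold makes the client more aggressive: more documents are fetched speculatively, which raises the chance of a prefetch hit but consumes more of the limited client-proxy bandwidth. Quantifying that tradeoff is what the paper's mathematical model is for.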

Original language: English (US)
Title of host publication: GLOBECOM - IEEE Global Telecommunications Conference
Pages: 948-952
Number of pages: 5
Volume: 2
State: Published - 2004
Event: GLOBECOM'04 - IEEE Global Telecommunications Conference - Dallas, TX, United States
Duration: Nov 29, 2004 - Dec 3, 2004

Other

Other: GLOBECOM'04 - IEEE Global Telecommunications Conference
Country: United States
City: Dallas, TX
Period: 11/29/04 - 12/3/04

Fingerprint

World Wide Web
Servers
Internet
Mathematical models
Bandwidth

Keywords

  • Prefetching
  • Proxy caching
  • Web caching
  • Web server
  • WWW modeling

ASJC Scopus subject areas

  • Engineering(all)

Cite this

Balamash, A., & Krunz, M. M. (2004). A client side WWW prefetching model. In GLOBECOM - IEEE Global Telecommunications Conference (Vol. 2, pp. 948-952). [GE10-1]
