Abstract
This paper proposes a data structure that performs forwarding lookups at gigabit speed by condensing the routing table of a backbone router into cache, thereby eliminating slow memory accesses. The proposed structure is based on the conventional trie, which is well suited to partial string searches. For longest-prefix IP lookups, each level of the trie corresponds to a segment of the IP address, and a node has multiple links, each representing one value of the address segment assigned to that level. When a given IP address reaches a dead-end node by following the links that match the address segment by segment, that node points to the routing entry used to forward the packet. To reduce size, the trie compresses the pointers to child locations and routing entries into a bit array in which a single bit encodes a pointer; hence we call the proposed structure the bit-map (BM) trie. For better performance, the BM trie jumps directly to the appropriate node at a middle level rather than starting from the root. Experiments show that the BM trie compacts backbone routers' tables into a 512-Kbyte cache and achieves around 2.4 million lookups per second on a Pentium II processor.
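As a rough illustration of the lookup described above, the following C sketch shows one way a bitmap-compressed multibit trie can be traversed. The 4-bit stride, the node layout, and the names (`bm_node`, `child_bitmap`, `result_bitmap`, `bm_lookup`) are assumptions made for illustration, not the paper's exact structure, and the sketch omits the middle-level jump optimization.

```c
/*
 * Minimal sketch of a bitmap (BM) trie lookup, assuming a 4-bit stride
 * per level. The node layout and field names below are illustrative
 * choices, not the exact structure from the paper.
 */
#include <stdint.h>

#define STRIDE  4                       /* IP address bits consumed per level        */
#define FANOUT  (1u << STRIDE)          /* 16 possible child links per node          */

typedef struct bm_node {
    uint16_t child_bitmap;              /* bit i set => a child exists for segment i */
    uint16_t result_bitmap;             /* bit i set => a next hop stored for i      */
    struct bm_node **children;          /* compact array: only existing children     */
    uint32_t *next_hops;                /* compact array: only stored next hops      */
} bm_node;

/* Count the set bits below position 'pos' to find the compact array index. */
static unsigned rank_before(uint16_t bitmap, unsigned pos)
{
    return (unsigned)__builtin_popcount(bitmap & ((1u << pos) - 1u));
}

/* Longest-match lookup: follow the links matching the address segment by
 * segment, remembering the most recent next hop seen on the way down.    */
uint32_t bm_lookup(const bm_node *root, uint32_t ip_addr)
{
    const bm_node *node = root;
    uint32_t best_hop = 0;              /* 0 = no route (default)                    */
    int shift = 32 - STRIDE;

    while (node != NULL && shift >= 0) {
        unsigned seg = (ip_addr >> shift) & (FANOUT - 1u);

        if (node->result_bitmap & (1u << seg))
            best_hop = node->next_hops[rank_before(node->result_bitmap, seg)];

        if (node->child_bitmap & (1u << seg))
            node = node->children[rank_before(node->child_bitmap, seg)];
        else
            break;                      /* dead end: best_hop holds the answer       */

        shift -= STRIDE;
    }
    return best_hop;
}
```

The sketch reflects the single-bit encoding idea: a set bit only signals that a child or routing entry exists, and counting the set bits that precede it (a popcount rank) yields the entry's index in a small packed array, so no full-width pointers need to be stored per link.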
Original language | English |
---|---|
Pages | 1872-1876 |
Number of pages | 5 |
State | Published - 2001 |
Event | IEEE Global Telecommunications Conference GLOBECOM'01 - San Antonio, TX, United States; Duration: 25 Nov 2001 → 29 Nov 2001 |
Conference
Conference | IEEE Global Telecommunications Conference GLOBECOM'01 |
---|---|
Country/Territory | United States |
City | San Antonio, TX |
Period | 25/11/01 → 29/11/01 |