Hacker News

gowld — yesterday at 4:05 PM

More important is that the new algorithm's bound has a multiplicative log factor on m (the number of edges), so it's only efficient for extremely sparse graphs.

If m > n (log n)^{1/3}, then this algorithm is slower than Dijkstra's.

For 1 million nodes, if the average degree is greater than roughly 3.5, the new algorithm has worse asymptotic complexity (ignoring unstated constant factors).
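As a back-of-envelope check of that crossover, the sketch below compares the new algorithm's O(m log^{2/3} n) against Dijkstra's O(m + n log n) and solves for the average out-degree m/n where the two expressions meet. The choice of base-2 logs is an assumption, and all constant factors hidden by the O-notation are ignored, so the exact figure is only indicative:

```python
import math

def dijkstra_cost(n, m):
    # Dijkstra with a Fibonacci heap: O(m + n log n)
    return m + n * math.log2(n)

def new_algo_cost(n, m):
    # The new algorithm: O(m * (log n)^{2/3})
    return m * math.log2(n) ** (2 / 3)

n = 1_000_000

# Setting m * log^{2/3} n = m + n log n and solving for m gives
# the edge count at which the two (constant-free) bounds cross.
crossover_m = n * math.log2(n) / (math.log2(n) ** (2 / 3) - 1)

print(f"crossover average out-degree m/n ≈ {crossover_m / n:.2f}")
```

Dropping the "+ m" term, as the comment above does, gives the simpler threshold m > n (log n)^{1/3}, i.e. m/n ≈ 2.7 for n = 10^6 with base-2 logs; keeping it nudges the crossover a bit higher, in the same ballpark as the ~3.5 figure quoted.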


Replies

usrusr — yesterday at 4:14 PM

"Any sufficiently sparse graph is indistinguishable from a linked list" comes to mind ;)

bee_rider — yesterday at 4:14 PM

Yeah, just based on this article that really stood out. It seems to be for a different use-case than Dijkstra's. An average degree of 3.5 covers an extremely practical and useful range of real-life graphs, so I just don't see any reason to pit the two against each other in a head-to-head comparison.