(Analysis by Dhruv Rohatgi)

Let's unpack the problem statement. Given $N$ lattice points in a 2D grid, we want to find a maximum-length sequence of points $(x_i, y_i)_{i=0}^k$, where each intermediate point is one of the given points, such that

$$0 = x_0 < x_1 < \dots < x_k = T$$
and
$$0 = y_0 < y_1 < \dots < y_k = T.$$
Out of all such sequences, we want to find the minimum possible value of the following expression:
$$\sum_{i=1}^k (x_i - x_{i-1})(y_i - y_{i-1}).$$
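
For a small made-up instance, take $T = 5$ and the three points $(1,2)$, $(2,4)$, $(3,3)$. No valid sequence can use all three points (after $(2,4)$ the $y$-coordinate would have to drop to $3$), so the maximum length is $k = 3$. The two maximum-length sequences are $(0,0), (1,2), (2,4), (5,5)$ with total area $1\cdot 2 + 1\cdot 2 + 3\cdot 1 = 7$ and $(0,0), (1,2), (3,3), (5,5)$ with total area $1\cdot 2 + 2\cdot 1 + 2\cdot 2 = 8$, so the answer is $7$.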

Setting aside the minimization objective for a moment, observe that the maximum length $k$ (not counting the initial point $(0,0)$) is the length of the Longest Increasing Subsequence of the points, treating $(T,T)$ as one of the points. Let $l(x,y)$ be the length of the longest increasing subsequence ending at point $(x,y)$. Then any maximum-length solution must satisfy $l(x_i, y_i) = i$ for all $1 \leq i \leq k$: the prefix ending at $(x_i, y_i)$ shows $l(x_i, y_i) \geq i$, and if it were larger we could splice a longer increasing prefix onto the remaining suffix and obtain a longer valid sequence.

Let $L(i)$ be the set of points with $l(x,y) = i$; we refer to this set as the $i$-th level. For two points $p = (x_p, y_p)$ and $q = (x_q, y_q)$ we say that $p$ dominates $q$, and write $p \gg q$, if $p$ is strictly greater than $q$ in both coordinates. Let $dp(p)$ be the minimum total rectangle area over all maximal-length sequences ending at $p$. Then for any $p \in L(i)$ with $i \geq 2$,

$$dp(p) = \min_{q \in L(i-1) \mid p \gg q} \left [ dp(q) + (x_p - x_q)(y_p - y_q) \right ],$$
with the base case $dp(p) = x_p y_p$ for $p \in L(1)$ (the rectangle spanned by $(0,0)$ and $p$).
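
To make the recurrence concrete before optimizing it, here is a direct implementation sketch, which spends $O(|L(i-1)| \cdot |L(i)|)$ time per level and is therefore too slow. It is only illustrative: it assumes the same global arrays x[], y[], dp[] and the constant INF from the full solution at the end of this analysis, and naiveTransition itself is not part of that solution.

void naiveTransition(const vector<int> &prevLevel, const vector<int> &curLevel)
{
	//prevLevel and curLevel hold the indices of the points in L(i-1) and L(i)
	for(int p : curLevel)
	{
		dp[p] = INF;
		for(int q : prevLevel)
			if(x[q] < x[p] && y[q] < y[p])	//only predecessors dominated by p are allowed
				dp[p] = min(dp[p], dp[q] + (x[p]-x[q])*((long long)(y[p]-y[q])));
	}
}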

Consider any $q_1, q_2 \in L(i-1)$ with $x_{q_1} < x_{q_2}$. Since $L(i-1)$ is a level, $y_{q_1} > y_{q_2}$. If $p_1, p_2 \in L(i)$ with $x_{p_1} < x_{p_2}$ and $p_1, p_2 \gg q_1, q_2$, it can be verified that if $q_1$ is "better" than $q_2$ for $p_1$ (i.e. gives a transition of no larger cost), then $q_1$ is also "better" than $q_2$ for $p_2$.
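
To verify this, compare the two transition costs as a function of $p$:
$$\Delta(p) := \left [ dp(q_1) + (x_p - x_{q_1})(y_p - y_{q_1}) \right ] - \left [ dp(q_2) + (x_p - x_{q_2})(y_p - y_{q_2}) \right ] = C - (y_{q_1} - y_{q_2})\,x_p - (x_{q_1} - x_{q_2})\,y_p,$$
where $C = dp(q_1) + x_{q_1} y_{q_1} - dp(q_2) - x_{q_2} y_{q_2}$ does not depend on $p$. Since $y_{q_1} - y_{q_2} > 0$ and $x_{q_1} - x_{q_2} < 0$, and moving from $p_1$ to $p_2$ along the level increases $x_p$ and decreases $y_p$, both $p$-dependent terms decrease, so $\Delta(p_2) \leq \Delta(p_1)$. In particular, if $\Delta(p_1) \leq 0$ then $\Delta(p_2) \leq 0$.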

In many dynamic programming problems, the above monotonicity observation suffices to speed up $O(N)$-time transitions to $O(\log N)$ or $O(1)$ amortized. In particular, suppose for a moment that the above DP transition did not have the condition $p \gg q$. Then the DP transitions could be computed efficiently by divide-and-conquer or a monotonic queue, since for any $q_1, q_2 \in L(i-1)$ with $x_{q_1} < x_{q_2}$, as we sweep right through $L(i)$, eventually $q_1$ "overtakes" $q_2$ in being the better transition.

However, we must deal with the $p \gg q$ conditions. Each point $q \in L(i-1)$ has some interval of points in $L(i)$ which dominate it (a contiguous range, since the points of $L(i)$ are sorted by increasing $x$ and decreasing $y$); call this $q$'s "interval of relevance". As $x_q$ increases, the corresponding interval monotonically shifts right. So the issue is that $q_1$ might "overtake" $q_2$ to become the best transition, but then $q_1$'s interval might end (before $q_2$'s interval has ended), and suddenly $q_2$ is once again the best transition!

Intuitively, the issue is that we cannot easily deal with intervals of relevance starting and ending in the middle of a sweep through $L(i)$. To avoid these events, we'd like to divide $L(i)$ into blocks so that intervals do not start or end within blocks. We cannot quite do this: if we divided $L(i)$ at every point where an interval started or ended, then every interval could intersect $O(N)$ blocks. But we can do something half as good: divide $L(i)$ into blocks so that no interval both begins and ends strictly inside a block, and no interval intersects more than $2$ blocks.

Let's suppose we have such a partition. How do we use it? Fix a block, and consider the intervals intersecting it. They come in two types: either the interval ends within the block, or it begins within the block---but crucially, not both. Using a monotonic queue, we can sweep right-to-left through the block and efficiently compute optimal transitions using only the points $q \in L(i-1)$ whose intervals of relevance end within the block. Separately, we can sweep left-to-right through the block and efficiently compute optimal transitions using only the points whose intervals start within the block. Then for each point in the block, its final DP value is the minimum of the two values computed. The key point which allows use of the monotonic queue is that points $q \in L(i-1)$ become relevant one by one as the sweep enters their intervals of relevance, but they never become irrelevant before the sweep of the block finishes. The time complexity is $O((I+B) \log B)$, where $B$ is the size of the block and $I$ is the number of intersecting intervals; the $\log B$ factor is due to binary searches to determine, for various $q_1, q_2 \in L(i-1)$, where exactly $q_1$ "overtakes" $q_2$.

Since the blocks are disjoint and every interval intersects at most two blocks, the overall cost of the transition from level $L(i-1)$ to level $L(i)$ is $O(M \log M)$, where $M$ is the total size of the two levels. Summing over all levels, the whole algorithm runs in $O(N \log N)$ time.

It remains to show how $L(i)$ can be partitioned into blocks with the desired properties. This is a simple consequence of the fact that the intervals of relevance are monotonic. Sweep left to right, and greedily construct each block to be as long as possible without strictly containing an interval. The first property holds by construction: no interval starts and ends strictly within a block. The second property is proved by contradiction. Suppose some interval intersects $3$ blocks; then it strictly contains one block, which in particular is not the last block. By construction this block (non-strictly) contains some interval: the greedy sweep stopped extending the block because adding the next point would have made it strictly contain some interval, and that interval then ends no later than the block's last point, so it lies within the block itself. So this smaller interval is strictly nested inside the big interval, which is impossible: by monotonicity, of any two intervals of relevance, the one that starts further right also ends at least as far right. This proves that the desired partition exists and can be found in linear time.
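
The implementation below follows this outline: computeLIS builds the level sets, findFirst and findLast compute each point's interval of relevance within the next level, and the main loop greedily groups the points of the previous level by their intervals and runs the two sweeps described above (solveStartingRegion for intervals beginning in the current column range, solveEndingRegion for intervals ending in it).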

#include <iostream>
#include <algorithm>
#include <vector>
using namespace std;
#define MAXN 300001
#define INF 1000000000000000000LL

int N, L;
int x[MAXN], y[MAXN];
int xid[MAXN];
int lis[MAXN];
long long dp[MAXN];

vector<int> levels[MAXN];	//lists of points in each LIS level set

bool cmpx(int a,int b)	//assume all points have distinct x-coordinate (and y-coordinate)
{
	return x[a] < x[b];
}

void computeLIS()
{
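	//standard patience-sorting LIS on the points in increasing x order: levels[m] collects the
	//points whose longest increasing subsequence ending there has length m+1, and each level
	//ends up sorted by increasing x and decreasing y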
	levels[0].push_back(xid[0]);
	lis[xid[0]] = 0;
	int mx = 0;
	for(int i=1;i<N;i++)
	{
		int cur = xid[i];
		int low = -1;
		int high = mx;
		while(low != high)
		{
			int mid = (low+high+1)/2;
			if(y[levels[mid].back()] < y[cur]) low = mid;
			else high = mid-1;
		}
		levels[low+1].push_back(cur);
		mx = max(mx, low+1);
		lis[cur] = low+1;
	}
}

long long cost(int i,int j)	//dp[i] + (x[j]-x[i])*(y[j]-y[i]), expanded with every product taken in long long
{
	return dp[i] + x[i]*((long long)y[i]) - x[i]*((long long)y[j]) - y[i]*((long long)x[j]) + x[j]*((long long)y[j]);
}

int findLocOvertake(int l,int i,int j) //requires x[i] < x[j]; first index on level l where i becomes the better transition than j (levels[l].size() if never)
{
	int low = 0;
	int high = levels[l].size();
	while(low != high)
	{
		int mid = (low+high)/2;
		if(cost(i, levels[l][mid]) < cost(j, levels[l][mid])) high = mid;
		else low = mid+1;
	}
	return low;
}

int firstDom[MAXN];
int lastDom[MAXN];

int findFirst(int l,int i)	//for i in level l-1, first point in level l dominating i; -1 if none
{
	int low = 0;
	int high = levels[l].size()-1;
	while(low != high)
	{
		int mid = (low + high)/2;
		if(x[levels[l][mid]] > x[i]) high = mid;
		else low = mid+1;
	}
	if(x[levels[l][low]] > x[i] && y[levels[l][low]] > y[i])
		return low;
	return -1;
}

int findLast(int l,int i) //for i in level l-1, last point in level l dominating i; -1 if none
{
	int low = 0;
	int high = levels[l].size()-1;
	while(low != high)
	{
		int mid = (low + high + 1)/2;
		if(y[levels[l][mid]] > y[i]) low = mid;
		else high = mid-1;
	}
	if(x[levels[l][low]] > x[i] && y[levels[l][low]] > y[i])
		return low;
	return -1;
}

int que[MAXN];	//monotonic queue of candidate points from the previous level, in insertion order
int overtake[MAXN];	//overtake[t]: index in the next level at which the better of que[t] and que[t+1] changes
vector<int> level;	//points of the previous level that are dominated by at least one point of the next level

void solveStartingRegion(int l, int iStart, int iEnd, int qStart, int qEnd) //intervals all start in [qStart, qEnd] and end at qEnd or later
{
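	//sweep left to right over columns [qStart, qEnd] of the next level; a candidate level[i]
	//is inserted once the sweep reaches firstDom[i], and it stays relevant through qEnd;
	//after the pops below, que[len-1] is the best candidate for the current column j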
	int len = 0;
	int i = iStart;
	for(int j=qStart;j<=qEnd;j++)
	{
		int q = levels[l+1][j];
		while(i <= iEnd && firstDom[i] <= j)
		{
			while(len >= 2 && overtake[len-2] <= findLocOvertake(l+1, que[len-1], level[i]))
				len--;
			que[len] = level[i];
			if(len >= 1)
				overtake[len-1] = findLocOvertake(l+1, que[len-1], level[i]);
			len++;
			i++;
		}
		while(len >= 2 && overtake[len-2] <= j)
			len--;
		dp[q] = min(dp[q], cost(que[len-1], q));
	}
}

void solveEndingRegion(int l, int iStart, int iEnd, int qStart, int qEnd)	//intervals all start at qStart or before, and end in [qStart, qEnd]
{
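	//mirror image: sweep right to left over columns [qStart, qEnd]; a candidate level[i] is
	//inserted once the sweep drops to lastDom[i], and it stays relevant down to qStart;
	//after the pops below, que[len-1] is the best candidate for the current column j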
	int len = 0;
	int i = iEnd;
	for(int j=qEnd;j>=qStart;j--)
	{
		int q = levels[l+1][j];
		while(i >= iStart && lastDom[i] >= j)
		{
			while(len >= 2 && overtake[len-2] >= findLocOvertake(l+1, level[i], que[len-1]))
				len--;
			que[len] = level[i];
			if(len >= 1)
				overtake[len-1] = findLocOvertake(l+1, level[i], que[len-1]);
			len++;
			i--;
		}
		while(len >= 2 && overtake[len-2] > j)
			len--;
		dp[q] = min(dp[q], cost(que[len-1], q));
	}
}

int main()
{
	cin >> N >> L;
	for(int i=0;i<N;i++)
		cin >> x[i] >> y[i];
	x[N] = y[N] = L;	//append the endpoint (T,T) as an ordinary point
	N++;
	for(int i=0;i<N;i++)
		dp[i] = INF, xid[i] = i;
	
	sort(xid,xid+N,cmpx);
	
	computeLIS();
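	//base case: points on the first level are reached directly from (0,0)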
	for(int i=0;i<levels[0].size();i++)
	{
		int cur = levels[0][i];
		dp[cur] = x[cur]*((long long)y[cur]);
	}
	for(int l=0;levels[l+1].size()>0;l++)	//compute dp values of level l+1 from level l
	{
		level.clear();
		for(int i=0;i<levels[l].size();i++)
		{
			int cur = levels[l][i];
			int fd = findFirst(l+1, cur);
			int ld = findLast(l+1, cur);
			if(fd != -1)	//must eliminate points in levels[l] not dominated by any points in levels[l+1]
			{
				firstDom[level.size()] = fd;
				lastDom[level.size()] = ld;
				level.push_back(levels[l][i]);
			}
		}
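		//greedily group the candidates: keep extending the group while the next candidate's
		//interval of relevance still begins by column lastDom[i]; columns [firstDom[i], lastDom[i]]
		//are handled by a left-to-right sweep over intervals beginning there (solveStartingRegion),
		//and columns (lastDom[i], lastDom[iEnd]] by a right-to-left sweep over intervals ending
		//in that range (solveEndingRegion)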
		for(int i=0;i<level.size();)
		{
			int iEnd = i;
			while(iEnd + 1 < level.size() && firstDom[iEnd + 1] <= lastDom[i])
				iEnd++;
			solveStartingRegion(l, i, iEnd, firstDom[i], lastDom[i]);
			if(lastDom[iEnd] >= lastDom[i] + 1)
				solveEndingRegion(l, i+1, iEnd, lastDom[i] + 1, lastDom[iEnd]);
			i = iEnd + 1;
		}
	}
	cout << dp[N-1] << '\n';
}