The right option is probably to use a container that does the work for you, like std::vector.
new and delete cannot resize, because they allocate just enough memory to hold an object of the given type. The size of a given type will never change. There are new[] and delete[] but there's hardly ever a reason to use them.
What realloc does in C is likely to be just a malloc, memcpy and free anyway, although memory managers are allowed to do something clever (such as extending the block in place) if there is enough contiguous free memory available.
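For illustration, here's a rough sketch of that naive strategy (naive_realloc is a made-up name, not a standard function):

#include <cstdlib>
#include <cstring>

// Toy illustration only: the real realloc does not take oldsize, because the
// allocator already knows how big the old block is (and may grow it in place).
void *naive_realloc(void *old, std::size_t oldsize, std::size_t newsize) {
    void *p = std::malloc(newsize);
    if (p && old) {
        std::memcpy(p, old, oldsize < newsize ? oldsize : newsize);
        std::free(old);
    }
    return p;   // null on failure, in which case the old block is untouched
}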
Resizing in C++ is awkward because of the potential need to call constructors and destructors.
I don't think there's a fundamental reason why in C++ you couldn't have a resize[] operator to go with new[] and delete[] that did something roughly like this (a sketch, with oldbuf, newbuf, oldsize and newsize as placeholder names):
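// rough sketch of what resize[] might expand to
newbuf = new Type[newsize];
std::copy_n(oldbuf, std::min(oldsize, newsize), newbuf);
delete[] oldbuf;
oldbuf = newbuf;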
Obviously oldsize would be retrieved from a secret location, same as it is in delete[], and Type would come from the type of the operand. resize[] would fail where Type is not copyable - which is correct, since such objects simply cannot be relocated. Finally, the above code default-constructs the objects before assigning them, which you would not want as the actual behaviour.
There's a possible optimisation where newsize <= oldsize, to call destructors for the objects "past the end" of the newly-ensmallened array and do nothing else. The standard would have to define whether this optimisation is required (as when you resize() a vector), permitted but unspecified, permitted but implementation-dependent, or forbidden.
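A sketch of what that shrink path might amount to (shrink_in_place is a made-up name; the real work would happen inside the runtime, which would also have to update its hidden element count so a later delete[] only destroys the surviving elements):

#include <cstddef>

template <class Type>
void shrink_in_place(Type *buf, std::size_t oldsize, std::size_t newsize) {
    // Destroy only the elements past the new end; the buffer itself stays put.
    for (std::size_t i = newsize; i < oldsize; ++i)
        buf[i].~Type();
}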
The question you should then ask yourself is, "is it actually useful to provide this, given that vector also does it, and is designed specifically to provide a resizeable container (of contiguous memory - that requirement was omitted in C++98 but added in C++03) that's a better fit than arrays with the C++ ways of doing things?"
I think the answer is widely thought to be "no". If you want to do resizeable buffers the C way, use malloc / free / realloc, which are available in C++. If you want to do resizeable buffers the C++ way, use a vector (or deque, if you don't actually need contiguous storage). Don't try to mix the two by using new[] for raw buffers, unless you're implementing a vector-like container.
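For example, a quick sketch of the C++ way, where the container does all of the reallocation for you:

#include <iostream>
#include <vector>

int main() {
    std::vector<int> buf;
    for (int i = 0; i < 10; ++i)
        buf.push_back(i);   // grows (and relocates) the buffer as needed
    buf.resize(4);          // shrink: destroys the tail elements, keeps the capacity
    std::cout << buf.size() << " of " << buf.capacity() << " slots used\n";
}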
Here's a std::move example: a simple vector that reallocates by hand, doubling the capacity each time we hit the limit. If there's a way to do better than the copy I have below, please let me know.
Compile as:
g++ -std=c++2a -O2 -Wall -pedantic foo.cpp
Code:
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <utility>

template<class T> class MyVector {
private:
    T *data;
    size_t maxlen;
    size_t currlen;
public:
    MyVector () : data (nullptr), maxlen(0), currlen(0) { }
    MyVector (size_t maxlen) : data (new T [maxlen]), maxlen(maxlen), currlen(0) { }
    ~MyVector () {
        delete[] data;
    }
    MyVector (const MyVector& o) {
        std::cout << "copy ctor called" << std::endl;
        data = new T [o.maxlen];
        maxlen = o.maxlen;
        currlen = o.currlen;
        std::copy(o.data, o.data + o.currlen, data);
    }
    MyVector (MyVector&& o) noexcept {
        std::cout << "move ctor called" << std::endl;
        // Steal the other vector's buffer and leave it empty, so its
        // destructor has nothing to free.
        data = std::exchange(o.data, nullptr);
        maxlen = std::exchange(o.maxlen, 0);
        currlen = std::exchange(o.currlen, 0);
    }
    // Copy/move assignment are omitted to keep the example short.
    void push_back (const T& i) {
        if (currlen >= maxlen) {
            // Double the capacity (start at 1 if we currently have none).
            maxlen = maxlen ? maxlen * 2 : 1;
            auto newdata = new T [maxlen];
            std::copy(data, data + currlen, newdata);
            delete[] data;
            data = newdata;
        }
        data[currlen++] = i;
    }
    friend std::ostream& operator<<(std::ostream &os, const MyVector& o) {
        auto s = o.data;
        auto e = o.data + o.currlen;
        while (s < e) {
            os << "[" << *s << "]";
            s++;
        }
        return os;
    }
};

int main() {
    MyVector<int> c(1);
    c.push_back(10);
    c.push_back(11);
    std::cout << c << std::endl;
    // Exercise the move constructor; c is left empty afterwards.
    auto d = std::move(c);
    std::cout << d << std::endl;
}