
One of the topics in my comp sci class is time complexity, using arrays and linked lists to compare how well each container handles certain operations, so you can choose the appropriate data structure. I understand the reasoning behind most of the operations, but I'm unsure about one: inserting and appending in an array.

The worst-case scenario for both of these is O(n). I believe I understand why inserting is O(n): in the worst case you insert at the front, which forces you to shift every element one position to the right, so the cost is linear in the number of elements in the array. For appending, I was curious why it is not O(1), since it takes one operation to add an element at the end regardless of the array's size, given that there is space.
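To make the two cases concrete, here is a rough sketch (C++, assuming a plain fixed-capacity buffer with spare room; `FixedArray`, `insert_front`, and `append` are just illustrative names, not any standard API):

```cpp
#include <cstddef>

// Illustrative fixed-capacity array: the first `size` slots of `data` are in use.
struct FixedArray {
    int         data[16];
    std::size_t size = 0;
};

// Insert at index 0: every existing element shifts one slot to the right -> O(n).
// (Assumes there is still spare capacity; no bounds check for brevity.)
void insert_front(FixedArray& a, int value) {
    for (std::size_t i = a.size; i > 0; --i)
        a.data[i] = a.data[i - 1];
    a.data[0] = value;
    ++a.size;
}

// Append at the end: a single write, no shifting -> O(1) while space remains.
void append(FixedArray& a, int value) {
    a.data[a.size++] = value;
}
```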

Is that the issue: if there isn't enough space, you have to copy the array into a larger one, and that is the worst-case scenario?

Define array. – Josué Oct 7 '13 at 21:47
    
Depends on the task: insert/delete ~= linked list, sort/computation ~= arrays – technosaurus Oct 7 '13 at 21:50

[...] if there isn't enough space, you have to copy the array into a larger one, and that is the worst-case scenario?

Bingo.
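That is exactly it: an append is a single write while spare capacity remains, but once the buffer is full, the whole array has to be copied into a larger allocation, which costs O(n). Here is a minimal sketch of that growth step (C++, doubling the capacity, which is one common policy; `DynArray` and its fields are illustrative, not a real library type):

```cpp
#include <algorithm>
#include <cstddef>

// Illustrative growable array: `data` points to a heap buffer with room for
// `capacity` ints, of which the first `size` are in use.
struct DynArray {
    int*        data     = nullptr;
    std::size_t size     = 0;
    std::size_t capacity = 0;
};

void append(DynArray& a, int value) {
    if (a.size == a.capacity) {                      // full: the O(n) case
        std::size_t new_cap = a.capacity ? a.capacity * 2 : 1;
        int* bigger = new int[new_cap];
        std::copy(a.data, a.data + a.size, bigger);  // copy every existing element
        delete[] a.data;
        a.data     = bigger;
        a.capacity = new_cap;
    }
    a.data[a.size++] = value;                        // otherwise a single write: O(1)
}
```

Because the capacity doubles each time, the expensive copy happens only rarely, so a long run of appends averages out to O(1) per append (amortized), even though any single append can hit the O(n) worst case.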


A typical array is a chunk of contiguous memory with a definite size, determined either at compile time or at run time. Strictly speaking, you do not remove or insert elements into an array at all; you simply write into memory that has already been allocated.

A linked list is a non-contiguous collection of memory chunks connected by means of their addresses. Removing and inserting elements into a linked list is genuinely possible: you allocate or free a node and rewire the surrounding pointers, as the sketch below shows.
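A minimal sketch of that rewiring (C++; `Node`, `insert_after`, and `remove_after` are illustrative names), where insertion and removal cost O(1) at a known position because nothing is shifted or copied:

```cpp
// Illustrative singly linked list node: each element stores the address of
// the next one, which is the extra memory an array does not need.
struct Node {
    int   value;
    Node* next;
};

// Insert a new node right after `pos`: rewire one pointer -> O(1).
void insert_after(Node* pos, int value) {
    pos->next = new Node{value, pos->next};
}

// Remove the node right after `pos`, again by rewiring -> O(1).
void remove_after(Node* pos) {
    Node* doomed = pos->next;
    if (doomed != nullptr) {
        pos->next = doomed->next;
        delete doomed;
    }
}
```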

The benefits of an array over a linked list are easier traversal and compactness (no extra memory is needed to store the address of the next [or previous] element). However, unlike a linked list, an array cannot be extended as easily.

Nevertheless, to talk more precisely about the time complexities of the operations a data structure supports, we first need to define that data structure.

Doubly linked lists? Do we store the addresses of both the first and last elements (as in a queue)? Binary trees (which are also linked, node-based structures)?

