I have a beginner problem but I'm stuck on it. Anyone?? Write a program in any language that takes a single integer array parameter and returns the decimal average of the input values.
First question: which language have you chosen? Second question: what exactly are you stuck on?
Python
I'm assuming you already know some Python, since you haven't mentioned where you're stuck. You'll need to:

1. Write a function that takes a parameter.
2. Check that the parameter is a list.
3. Initialize a counter for the sum to 0.
4. Loop over the array, adding every element into your counter.
5. Get the length of the array and divide the sum by it to find the average.

The one catch (and probably the reason for this exercise) is that in Python 2, int / int returns an int (Python 3 returns a float), truncating anything after the decimal point. The specification says you want a decimal average. Since I'm not overly fluent in Python, I recommend doing what I would do: google "python float cast" or something similar. Casting just one of the operands to float should do it. Return the result.
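The steps above can be sketched roughly as follows (a minimal sketch assuming Python 3, where / on two ints already returns a float; the name array_average is just my choice):

```python
def array_average(values):
    # Step 2: check that we actually got a list
    if not isinstance(values, list):
        raise TypeError("expected a list of integers")
    total = 0
    for v in values:       # steps 3-4: accumulate every element into a counter
        total += v
    n = len(values)        # step 5: length of the array
    return total / n       # Python 3 true division returns a float

print(array_average([1, 2, 3, 4]))  # 2.5
```

In Python 2 you would write `return float(total) / n` instead, as discussed above.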
I'll write a Python solution since it looks like pseudocode :-D

def arrayAverage(A):
    n = len(A)             # stores the length of the array; in C you may want
    sum = 0                # to supply int n as an additional parameter
    for i in xrange(n):    # iterates through all indices of A, from 0 to n-1;
        sum += A[i]        # in Python you can also iterate the elements of A directly
    return float(sum) / n
You can also do this in Python, which seems more Pythonic:

def arrayAverage(A):
    n = len(A)
    sum = 0
    for number in A:
        sum += number
    return float(sum) / n