A method for dynamically optimizing bandwidth allocation in a variable bitrate conference environment. Conference means with two or more outputs are provided, each of which can output data at a different rate, in order to support two or more endpoints that may have different media rates. Two or more endpoints are connected to the conference means for participating in the conference. Whenever more than one video rate is used by participants during the conference, candidate sets of output rates are selected from all possible combinations of output rates in the conference means, wherein the lowest output rate in each selected set is the entry rate of the endpoint joining the conference at the lowest rate.
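As an illustration of this selection step, the candidate sets can be drawn from the distinct video rates declared by the endpoints, keeping only those combinations whose lowest member equals the lowest entry rate in the conference. The sketch below is only one reading of the abstract; the helper name `candidate_sets` and the use of the endpoints' entry rates as the pool of possible output rates are assumptions, not taken from the text.

```python
from itertools import combinations

def candidate_sets(entry_rates, outputs):
    """Enumerate candidate output-rate sets of size `outputs`.

    Assumption: possible output rates are drawn from the distinct
    entry rates declared by the endpoints, and every candidate set
    must contain the lowest entry rate in the conference, since the
    lowest output rate of each set must equal the entry rate of the
    lowest-rate endpoint.
    """
    rates = sorted(set(entry_rates))
    lowest = rates[0]
    for combo in combinations(rates, outputs):
        if min(combo) == lowest:
            yield combo

# Example: four endpoints, conference means with three outputs.
print(list(candidate_sets([128, 384, 768, 1920], outputs=3)))
```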
A Quality Drop Coefficient (QDC) is then determined, for each selected set, for each endpoint that joins the conference, wherein the QDC is computed from the endpoint's entry rate and the highest rate, among the output rates of that set, that is lower than or equal to said endpoint's entry rate. A Quality Drop Value (QDV) is calculated for each of the selected sets, and, preferably, the set of output rates with the lowest QDV is determined to be the optimal video rate set.
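The abstract does not state the exact QDC and QDV formulas. One common reading is that the QDC measures the relative drop from an endpoint's entry rate to the highest set rate not exceeding it, and that the QDV aggregates the QDCs of all endpoints. The sketch below, which continues the enumeration sketch above and reuses `candidate_sets`, follows that reading and should be treated as an assumption rather than the patented formula.

```python
def qdc(entry_rate, rate_set):
    """Quality Drop Coefficient for one endpoint and one candidate set.

    Assumption: the QDC is the relative drop from the endpoint's entry
    rate to the highest rate in the set that does not exceed it.
    """
    assigned = max(r for r in rate_set if r <= entry_rate)
    return (entry_rate - assigned) / entry_rate

def qdv(entry_rates, rate_set):
    """Quality Drop Value of a candidate set (assumed: sum of the QDCs)."""
    return sum(qdc(rate, rate_set) for rate in entry_rates)

def optimal_set(entry_rates, outputs):
    """Pick the candidate set with the lowest QDV."""
    return min(candidate_sets(entry_rates, outputs),
               key=lambda s: qdv(entry_rates, s))

entry_rates = [128, 384, 768, 1920]
best = optimal_set(entry_rates, outputs=3)
print(best, qdv(entry_rates, best))   # -> (128, 384, 1920) 0.5
```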
The video rate of all endpoints having a video rate above the highest optimal video rate is reduced, if required, to the highest optimal video rate, and the video rate of other endpoints having a video rate between two consecutive levels of optimal video rates is reduced to the lower of said levels.
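Both reductions amount to serving each endpoint at the highest optimal rate that does not exceed its own video rate. The mapping below is an illustrative sketch under that interpretation; the function name and the fallback for out-of-range rates are assumptions.

```python
def assign_rates(video_rates, optimal_rates):
    """Map each endpoint's video rate onto the optimal rate levels.

    Endpoints above the highest optimal rate are reduced to it; an
    endpoint whose rate falls between two consecutive optimal levels
    is reduced to the lower of the two.
    """
    levels = sorted(optimal_rates)
    assignments = {}
    for rate in video_rates:
        eligible = [lvl for lvl in levels if lvl <= rate]
        # The lowest level is the lowest entry rate in the conference,
        # so `eligible` should never be empty; the fallback is defensive.
        assignments[rate] = eligible[-1] if eligible else levels[0]
    return assignments

print(assign_rates([128, 384, 768, 1920], (128, 384, 1920)))
# -> {128: 128, 384: 384, 768: 384, 1920: 1920}
```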
Whenever a change occurs either in the number of participating endpoints in the conference or in the declared bit rate capability of the participating endpoints, the video rates of all the outputs are recalculated.
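A conference controller could trigger that recalculation from join, leave, and capability-change events; the class below, which reuses `optimal_set` from the earlier sketch, is a hypothetical illustration of when the optimization is re-run, not a description of the patented implementation.

```python
class Conference:
    """Hypothetical controller that recomputes output rates on change."""

    def __init__(self, outputs):
        self.outputs = outputs
        self.entry_rates = []     # declared bit rates of the endpoints
        self.output_rates = ()

    def _recalculate(self):
        distinct = set(self.entry_rates)
        # The optimization applies when more than one video rate is in
        # use; this sketch also requires enough distinct rates to fill
        # every output, a simplification of the real selection step.
        if len(distinct) >= max(2, self.outputs):
            self.output_rates = optimal_set(self.entry_rates, self.outputs)

    def endpoint_joined(self, declared_rate):
        self.entry_rates.append(declared_rate)
        self._recalculate()

    def endpoint_left(self, declared_rate):
        self.entry_rates.remove(declared_rate)
        self._recalculate()

    def rate_changed(self, old_rate, new_rate):
        self.entry_rates.remove(old_rate)
        self.entry_rates.append(new_rate)
        self._recalculate()
```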