Term to express a range of fluctuation

In summary, the term the user is looking for describes a user-defined range of rescaling percentages, used in image recognition software to specify the range of resizing allowed when comparing a memorized image to a new image. This range, related to the "scale factor," represents the upper limit of fluctuation allowance in image size; specifying 130%, for example, includes scaling anywhere between 100% and 130% of the original image size. While other factors, such as rotation or distortion, may also contribute to size fluctuation, this term refers specifically to variations in size caused by the distance between the camera and the object. Image recognition documentation may provide more specific terminology.
  • #1
Jun Kyoto
I am trying to come up with a term for a function on an image sensor.
The term is to express "the upper limit of fluctuation allowance in image size, which is specified in %."

The percentage does not express the ratio of one particular enlarged image size to the original image size; for example, 150% does not mean that the sensor will only detect an image that is exactly 150% of the original size.

Instead, it expresses the range of size fluctuation the sensor will accept when detecting the shape (image), so 150% means the sensor will detect the original image upscaled at any rate from 101% to 150%, such as an image at 102%, 117%, or 142% of the original size.

Does any of the following describe the concept well?
If not, what is the problem?

- Maximum size fluctuation allowance percentage
- Maximum size volatility allowance percentage

Or,
- Size fluctuation allowance percentage upper limit
- Size volatility allowance percentage upper limit

Or, would "percentage" be better as "rate"?
Also, all of the candidates seem redundant. Is there any term that combines some of the words? Thank you in advance.
 
  • #2
I think you are looking for the term 'scale factor'. Most APIs I've seen would express '150%' as a scale factor of '1.5'.
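
Just to illustrate the convention in plain Python (nothing specific to any particular library, and the variable names are made up for illustration):

```python
percent_limit = 150                    # the value a user would enter, in %
scale_factor = percent_limit / 100.0   # 1.5 -- the form most APIs expect
print(scale_factor)                    # 1.5
```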
 
  • #3
Jun Kyoto said:
I am trying to come up with a term for a function on an image sensor.
The term is to express "the upper limit of fluctuation allowance in image size, which is specified in %."

The percentage does not express the ratio of one particular enlarged image size to the original image size; for example, 150% does not mean that the sensor will only detect an image that is exactly 150% of the original size.

Instead, it expresses the range of size fluctuation the sensor will accept when detecting the shape (image), so 150% means the sensor will detect the original image upscaled at any rate from 101% to 150%, such as an image at 102%, 117%, or 142% of the original size.

Does any of the following describe the concept well?
If not, what is the problem?

- Maximum size fluctuation allowance percentage
- Maximum size volatility allowance percentage

Or,
- Size fluctuation allowance percentage upper limit
- Size volatility allowance percentage upper limit

Or, would "percentage" be better as "rate"?
Also, all of the candidates seem redundant. Is there any term that combines some of the words? Thank you in advance.

I'm having a hard time understanding what you are asking about. Could you perhaps give a concrete example of what you are asking about? Do you have a camera system and image processing software in mind for this question? Can you post some images that illustrate what you are asking about? Thank you.
 
  • #4
ScottSalley said:
I think you are looking for the term 'scale factor'. Most APIs I've seen would express '150%' as a scale factor of '1.5'.

Thank you, ScottSalley.
However, when I look up "scale factor" on Wikipedia, it says it is a "coefficient." Does it also imply a "range"? The substance of the concept I am trying to express is the "limit of a range (allowance)."
 
  • #5
berkeman said:
I'm having a hard time understanding what you are asking about. Could you perhaps give a concrete example of what you are asking about? Do you have a camera system and image processing software in mind for this question? Can you post some images that illustrate what you are asking about? Thank you.

Thank you berkeman, and sorry for the lack of context.

Yes, I do have a camera system and image processing software in mind; it is about an image sensor.

The software memorizes a sample image and can then check for the presence of that image within another image. To find the sample image when it appears as part of another image and is smaller or bigger than it was at memorization, the software rescales the image internally.

The user can specify the range of rescaling as a percentage because he or she may only want to find the image at, say, no bigger than 130% of the original size. What's important here is that when the user specifies 130%, it includes scaling anywhere between 100% and 130% (images at 101%, 122%, and 130% would all be detected). This "user-defined range of rescaling percentages" is what I am looking for a term for.
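
To make the idea concrete, here is a minimal sketch in Python with OpenCV of the kind of search I have in mind (the function name find_template and the parameter max_scale_percent are placeholder names I made up for illustration, not the actual sensor's API):

```python
import cv2

def find_template(scene, template, max_scale_percent=130, step_percent=1):
    """Search for `template` inside `scene` at every scale from 100% up to the
    user-defined upper limit (e.g. 130%), and return the best match found."""
    best_score, best_scale, best_loc = -1.0, None, None
    for percent in range(100, max_scale_percent + 1, step_percent):
        scale = percent / 100.0
        # Rescale the memorized template rather than the scene image.
        resized = cv2.resize(template, None, fx=scale, fy=scale,
                             interpolation=cv2.INTER_LINEAR)
        if resized.shape[0] > scene.shape[0] or resized.shape[1] > scene.shape[1]:
            break  # the rescaled template no longer fits inside the scene
        # Normalized cross-correlation between the scene and the rescaled template.
        result = cv2.matchTemplate(scene, resized, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val > best_score:
            best_score, best_scale, best_loc = max_val, scale, max_loc
    return best_score, best_scale, best_loc
```

The single number 130 then stands in for the whole range 100-130%, and that single user-defined number is what I need a name for.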
 
  • #6
Jun Kyoto said:
Thank you berkeman, and sorry for the lack of context.

Yes, I do have a camera system and image processing software in mind. The software memorizes a sample image and can then check for the presence of that image within another image. To find the sample image when it appears as part of another image and is smaller or bigger than it was at memorization, the software rescales the image internally. The user can specify the range of rescaling as a percentage because he or she may only want to find the image at no bigger than 130% of the original size. This "user-defined range of rescaling percentages" is what I am looking for a term for.

Ah, that helps a lot. Can there be any rotation of the object, or other distortion other than just size? Is the variation in size due to the object being farther away from or closer to the camera than the original memorized image of the object? Have you looked through "Image Recognition" software documentation to see if the term you are looking for is there?
 
  • #7
berkeman said:
Ah, that helps a lot. Can there be any rotation of the object, or other distortion other than just size? Is the variation in size due to the object being farther away from or closer to the camera than the original memorized image of the object? Have you looked through "Image Recognition" software documentation to see if the term you are looking for is there?
Yes, the size fluctuation we are talking about here is mainly caused by the varying distance between the camera and the object, although it may also include fluctuation caused by other distortions. I haven't looked through image recognition documentation yet; I should. Do you have any good online resources in mind?
 
  • #8
Maybe, "scale factor range"? Researching farther, though
 
  • #9
I think maximum/minimum scaling factor will do.
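
For instance, a configuration block for such a search tool might look like this (Python, with made-up names purely for illustration):

```python
# Hypothetical settings for the shape-search function (illustrative names only).
search_settings = {
    "min_scaling_factor": 1.00,  # 100%: never accept a match smaller than the memorized image
    "max_scaling_factor": 1.30,  # 130%: upper limit of the accepted size range
}
```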
 

